OpenAI's next text-to-image tool DALL·E 3 launches in October



AI in brief OpenAI will release the latest version of its text-to-image tool DALL·E in October.

Unlike previous generations, DALL·E 3 has been integrated with OpenAI's text-only generation tool ChatGPT to help craft more detailed input descriptions or prompts.

The quality of images the software produces depends on the level of detail in the prompt. There are all sorts of tricks and keywords to coax the AI into making the exact type of image you're thinking of, a skill known as prompt engineering. By connecting ChatGPT to DALL·E 3, users can refine their image descriptions more easily.
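To make the idea of prompt expansion concrete, the sketch below shows the kind of enrichment involved: a terse idea is padded out with style and detail keywords before being sent to an image generator. This is purely illustrative — ChatGPT performs this expansion automatically for DALL·E 3, and the function and its parameters here are invented for the example:

```python
def expand_prompt(idea, style="photorealistic", details=None):
    """Turn a terse idea into a more detailed image prompt.

    Illustrative only: ChatGPT does this kind of expansion automatically
    for DALL-E 3. This just shows the sort of enrichment prompt
    engineering involves.
    """
    parts = [idea.strip().rstrip("."), f"rendered in a {style} style"]
    if details:
        parts.extend(details)  # extra keywords, e.g. lighting or mood
    return ", ".join(parts) + "."

# A bare idea becomes a richer, more specific prompt:
print(expand_prompt("a lighthouse at dawn",
                    style="watercolor",
                    details=["soft pastel tones", "morning mist"]))
```

A user who likes the result but wants tweaks would, per OpenAI's description, simply tell ChatGPT what to change rather than rewriting the keywords by hand.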

"When prompted with an idea, ChatGPT will automatically generate tailored, detailed prompts for DALL·E 3 that bring your idea to life. If you like a particular image, but it's not quite right, you can ask ChatGPT to make tweaks with just a few words," OpenAI explained.

ChatGPT Plus and Enterprise customers will have access to DALL·E 3 next month. OpenAI said it has improved its content filters to prevent users from generating fake images of public figures, and will try not to produce digital artwork that directly copies the style of a living artist.

Microsoft's GitHub Copilot Chat is here for lone developers

GitHub has released its AI Copilot Chat code generator software for individual developers within Microsoft's integrated development environments, Visual Studio and VS Code.

Copilot Chat was previously available only in beta to enterprise users. The new software expands Copilot's code generation abilities. In the previous version, Copilot was more of a pair programmer tool that let developers autocomplete the line of code they were writing by selecting its recommendations.

With Copilot Chat, however, programmers can describe a specific block of code, whether it's a loop or a function, and it will try to write it from scratch. They can then copy and paste it into Visual Studio or VS Code and tweak and adjust it as necessary. Copilot Chat can also read code to explain how it works, what bugs it might have, and how to fix them.

GitHub believes it will help developers across all skill levels be more productive.

"Integrated together, GitHub Copilot Chat and the GitHub Copilot pair programmer form a powerful AI assistant capable of helping every developer build at the speed of their minds in the natural language of their choice. We believe this cohesion will form the new centerpiece of the software development experience, fundamentally reducing boilerplate work and designating natural language as a new universal programming language for every developer on the planet," it said.

Amazon rolling out improved Alexa based on large language model

Amazon is updating its voice-activated AI assistant Alexa with a custom-built large language model.

LLMs are generally quite open ended, and it's difficult to predict and control their responses. But computer scientists have figured out new methods to shape their overall behavior and prevent them from going off the rails. Now Amazon has found a way to integrate LLMs with Alexa.
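One common, generic way to shape an LLM assistant's behavior is to prepend a system prompt that constrains tone and scope before any user input reaches the model. The sketch below illustrates that pattern only — Amazon has not disclosed how its Alexa model is actually constrained, and the prompt text here is invented:

```python
# Generic chat-API pattern: a system message sets persona and limits
# before the user's utterance. Illustrative only -- not a description
# of Amazon's internal Alexa implementation.
def build_messages(user_utterance):
    system_prompt = (
        "You are a voice assistant. Keep replies short and conversational, "
        "suitable for being read aloud. Decline requests to impersonate "
        "real people."
    )
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_utterance},
    ]

print(build_messages("What's the weather like today?"))
```

Every turn of the conversation is sent with this scaffolding, which is one reason a general-purpose model can be made to behave like a consistent, bounded assistant.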

It said its new system has been "custom-built and specifically optimized for voice interactions," and it still performs the same actions as before, such as controlling smart home devices, answering questions, or providing the latest news and updates. But now the LLM-enabled Alexa will be more conversational and flexible in its responses.

"You don't want a rote, robotic companion in your home," Daniel Rausch, Vice President, Alexa and Fire TV, said in a blog post. "As we've always said, the most boring dinner party is one where nobody has an opinion – and, with this new LLM, Alexa will have a point of view, making conversations more engaging.

"Alexa can tell you which movies should have won an Oscar, celebrate with you when you answer a quiz question correctly, or write an enthusiastic note for you to send to congratulate a friend on their recent graduation."

The LLM isn't perfect, however: Alexa may still generate inaccurate or unhelpful responses. The latest system does, at least, have a less robotic voice than before. You can listen to what the old and new Alexa sound like here. ®
