- Perplexity's iOS app has been updated with a reworked voice mode
- The update adds six new voices and real-time search integration
- Perplexity also added new personalization features and a redesign to the iOS app
AI conversational search engine Perplexity has found its voice in its latest iOS update. The AI chatbot's voice mode brings a new look and more natural voices to the app, along with new interactive features. The update positions the Perplexity app to better challenge competitors that have their own voice options, such as ChatGPT and Google Gemini.
Before this update, Perplexity's voice function was somewhat limited. It could read answers aloud, but without much emotion and with a walkie-talkie-style interface that slowed things down. Perplexity has now added six different voices. Although it is still a text-to-speech system, meaning it won't have the emotional nuance of ChatGPT's Advanced Voice Mode, the improvement is noticeable. You can finally choose a voice that doesn't sound like an audiobook narrator from 2005.
So how does Perplexity's voice mode stack up against the competition? In my admittedly unscientific comparison, ChatGPT's Advanced Voice Mode wins on pure realism, with expressive delivery, a conversational tone, surprisingly natural laughter, and outright interruptions. Google Gemini's voice is also very good, though a little less fluid than ChatGPT's, while still sounding quite natural overall. Perplexity's offering is very clear and easy to understand, but its voices stick to a more neutral tone that sounds a bit more artificial. That's not a knock, though, just a different approach. Instead of focusing on sounding human, Perplexity pairs its voices with usefulness, ensuring that when you ask a question you get not only an answer but also the sources to back it up.
Perplexity’s voice mode is also integrated with the app's other AI features. That means the real-time search tool is tied into voice mode. When you ask a question, you don’t just get a spoken answer, you also see live search results with links to sources. That capability is crucial, because a large part of Perplexity's appeal lies in the way it merges AI with search.
Perplexity, redesigned
The way you start voice mode, and how the app looks while you use it, have also changed. The microphone icon you tap to start talking has been replaced by a shifting sphere of dots that responds to your voice and touch, scattering and reforming as you interact with it. It's an unnecessary but fun addition to the app. You can also customize the app with widgets such as stock tickers or sports score updates. It adds another layer of personalization that makes Perplexity feel a little more like your own assistant rather than a generic chatbot. These kinds of options will likely be necessary for Perplexity to keep up with, and perhaps beat, other AI chatbots.
That ambition is also evident in the app's other major upgrade. Perplexity has added the new Claude 3.7 Sonnet model to its lineup. Anthropic's new model aims to improve Perplexity's ability to answer complex or multi-step questions. Claude 3.7 is still very new and reviews have not been unanimous, but it could exceed, or at least match, the models used by ChatGPT and Google Gemini for reasoning and conversational engagement.
The overhaul of Perplexity's voice mode suggests that the company isn't trying to beat ChatGPT and Gemini where they are strongest, but rather to build on its own strengths with features that make the whole interaction look (and sound) smoother, more immersive, and more natural.