Apple's new features look like a pale imitation of Google Lens, as the tech giant has decided to keep the services rather limited.
At its Worldwide Developers Conference, Apple announced two new features arriving with the next software update: Live Text and Visual Look Up. They might ring a bell, since both are strongly reminiscent of the well-known Google Lens.
iOS 15 will ship with both of these features. They are promising as new launches, yet still not quite up to the usual 'Apple standard.' Apple's attempt to replicate what Google Lens already offers is plain to see. The question is whether Apple's new update can even step into Google Lens's shadow, or whether these features will prove a fleeting trend.
The first of the two, Live Text, lets users turn handwritten or printed documents into digital text that can be copied. Apple says its approach differs from Google's under the hood: recognition runs on-device using neural networks, rather than the cloud-based processing most rivals rely on. Using it is simple: tapping the viewfinder icon in the bottom-right corner of the camera activates the feature. You then press a finger on the text and use the familiar Apple text-selection gestures to mark it, after which you can copy it wherever you like.
Apple says the feature also works on old or pre-existing photos, though the results there are not claimed to be as accurate or effective. The clearest drawback is language support: where Google Lens understands up to 100 languages, Live Text launches with just seven. Google has the advantage of a years-long head start, but 100 versus seven is a hard gap to excuse.
Next comes Visual Look Up. One of the most promising features of the new iOS update, it surfaces information about the photos you have captured: the origin of a book, the fabric of a garment, or the recipe for a dish. Before we get ahead of ourselves, though, Visual Look Up only covers five basic categories at launch.
How deep the feature's knowledge goes is also unclear at this point, which raises real questions about its reliability. Pitted against a search giant like Google Lens, such a shallow catalogue is bound to disappoint.
While many remain skeptical of these updates, we think they could be the first step toward Apple's augmented-reality ambitions and the much-rumored 'Apple Glasses.' To see whether it is all talk or the features have a promising future, we will have to wait until mid-September, when the update is expected to be released.
Stay tuned to find out more about the ongoing 'Apple vs. Google' battle.