Google has promised to launch additional features powered by artificial intelligence.
Google has announced cutting-edge ways to use artificial intelligence to make information exploration through its search engine more natural and intuitive.
*New features launched by Google that use artificial intelligence in its search engine to meet user needs*
Google said: “Our products at Google have a unique goal, which is to be as useful to you as possible, whether in moments big or small. We have always believed in the power of artificial intelligence to enable us to achieve this goal.”
The company also said that since the search engine's early days, AI has helped it understand language and make results more useful. Over the years, Google has expanded its investment in artificial intelligence, which can now understand and combine information in its various forms, including language, images, and video.
Google added that the camera has become a powerful way to understand and explore the world around us: the Lens service is now used more than 10 billion times a month, as people use it to search for what they see through the camera and in their photos.
After letting users search with Lens using the camera or images directly from the search bar, Google says users will soon be able to search for whatever is on the screen of their Android phones.
The company said that with this technology, people will be able to search for what they see in photos or videos across the websites and apps they use, such as messaging and video apps, without having to leave the app.
Google gave an example: “Suppose your friend sends you a message containing a video of him exploring Paris. If you want to know more about the landmark you see in the background, you can simply long-press the power button or home button on your Android phone to invoke the Google Assistant, then tap ‘search screen’. Lens identifies the landmark as the Luxembourg Palace, and you can tap to learn more.”
The technology giant had previously launched the multisearch feature, which lets people search using an image and text at the same time, and it has now announced the feature's global availability on smartphones, in all languages and countries where Lens is available.
Google also said it recently extended multisearch to allow searching for things nearby: the user can take a picture and then add the phrase “near me” to find what they need, whether they want to support nearby businesses or are simply in a hurry to find something. This capability is currently available in English in the United States and will roll out globally in the coming months.
Google also said that in the next few months, people worldwide will be able to use multisearch on any image they see on the search results page on their phones.
Google explained: “For example, you might search for ‘modern living room ideas’ and see a coffee table you like, but prefer it in a different shape, say rectangular instead of circular. You can use multisearch to add the word ‘rectangular’ to find the style you're looking for.”