Google’s AI Future for Home: Stunning!

Google is making several announcements today ahead of its big Pixel event next Tuesday. Alongside the new Nest Learning Thermostat and the Google TV Streamer, Google is also previewing some substantial changes coming to Google Home and the Google Assistant. And all of them look genuinely impressive.

Let’s start with the Google Assistant. Google has revealed a new voice for the Assistant, and it sounds far more natural than the current one. It’s hard to convey in writing, but the gist is that the Assistant now sounds more like a human and less like a robot. It takes natural pauses while speaking and has inflection in its tone.

The Google Assistant is also getting better and more natural at handling follow-up questions. In a demo video I watched, someone asks the Google Assistant whether Pluto is still a planet. The Assistant explains that it isn’t, and that the International Astronomical Union (the IAU) decided to reclassify Pluto as a dwarf planet. The person then simply asks, “Could they change their minds again?” The Assistant understands that “they” refers to the IAU and that the person is asking whether the organization could reverse its decision on Pluto’s status as a dwarf planet.

As interesting as all of that is, the really exciting stuff involves Google Home. Google showed off its plans for bringing Gemini into the Google Home experience, and even as someone who hasn’t been especially impressed by existing Gemini features, what it’s adding to Google Home is remarkably impressive.

My favorite Gemini feature is how you can use it to create automations. Automations are a core part of any smart home, but they aren’t exactly easy to set up. Having the lights turn on automatically when you get home is great, but configuring that yourself can be more complicated than it sounds.

With Gemini, you’ll be able to create automations simply by saying or typing what you want the automation to do. In one example, Google shows someone using Gemini in the Google Home app and saying, “Help the kids remember to put their bikes in the garage when they come home from school.” From that, Gemini creates an automation that turns on the garage lights and broadcasts a reminder to put the bikes away whenever someone arrives home between 3:30 p.m. and 5 p.m. You can then tap a button to see the full automation and customize it if you want, or tap another button to save it, and that’s it.
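To make that structure a little more concrete, here’s a minimal sketch of the trigger-plus-actions shape Gemini is filling in for you, written in Python with made-up names. This is purely illustrative; it isn’t Google’s actual automation schema or API, just a rough picture of what the generated automation amounts to.

```python
from dataclasses import dataclass, field
from datetime import time

# Hypothetical model of the automation Gemini generates from
# "Help the kids remember to put their bikes in the garage when they
# come home from school." All names and fields are illustrative.

@dataclass
class Trigger:
    event: str           # e.g. "person_arrives_home"
    window_start: time   # only fire within this time window
    window_end: time

@dataclass
class Action:
    target: str          # e.g. "garage_lights", "speaker_broadcast"
    command: str
    payload: str = ""

@dataclass
class Automation:
    name: str
    trigger: Trigger
    actions: list[Action] = field(default_factory=list)

bike_reminder = Automation(
    name="Bike reminder",
    trigger=Trigger(
        event="person_arrives_home",
        window_start=time(15, 30),  # 3:30 p.m.
        window_end=time(17, 0),     # 5:00 p.m.
    ),
    actions=[
        Action("garage_lights", "turn_on"),
        Action("speaker_broadcast", "announce",
               "Remember to put your bikes in the garage!"),
    ],
)
```

The whole point of the feature is that you never have to write anything like this yourself: Gemini works out the trigger, the time window, and the actions from your plain-language request, and the app simply lets you review or tweak the result.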

Gemini is also going to make searching your camera activity much easier. Using the same bike example, you could go to the Activity page in Google Home and search, “Did the kids leave their bikes in the driveway?” You get a clear answer at the top, followed by the video clips Gemini pulled to support it. That might sound straightforward when described this way, but the technical work happening behind the scenes to make it feel so seamless is genuinely impressive.

All of this is possible because Gemini will dramatically improve the quality and detail of what your smart-home cameras can describe. Today, for instance, a Nest camera watching your backyard can alert you if it spots a bird on your feeder and recognizes it as an animal. With Gemini, it could give a much richer description of the scene, such as:

“A blue jay at a seed-filled feeder. Its blue and white feathers vibrant against a dull, wintry backdrop. There are no people or vehicles, just tranquil natural scenery and the colorful bird.”

While we’ll have to see how all of this works in the real world versus pre-rendered demos in a press briefing, everything Google is showing here looks great. Google often announces Gemini features without a clear explanation of how they’re supposed to make your life easier, but that’s not the case here. Using Gemini to create automations is clever and something I’m eager to try. The upgraded Google Assistant sounds wonderful. The new AI tools for Nest cameras feel like something straight out of the future.

Now for the big question: When will you actually get these features? Google says it will start rolling everything out to Nest Aware subscribers in a Public Preview later this year. The exact timing is unclear, but I certainly hope it’s sooner rather than later. Google is onto something special here, and I can’t wait to get my hands on all of it.
