Tuesday, February 7th 2023

Google Prepares ChatGPT Alternative Called Bard AI

OpenAI's ChatGPT has reportedly reached an astonishing 100 million monthly active users in the heating-up AI wars. The figure was reached after only a few months of availability, and big tech companies are under pressure to respond. Today, we have information that Google will release a ChatGPT-like model called Bard AI. Based on the Language Model for Dialogue Applications (LaMDA) that Google introduced over two years ago, Bard AI will integrate with Google Search to access the latest information from around the web. Currently in preview for private testers, Bard AI will roll out to the public in the coming weeks as demand for chat-style Large Language Models soars.
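For readers unfamiliar with what "chat format" means in practice: instead of a single prompt, these models consume a running list of role-tagged messages and return the next assistant turn. Below is a minimal, purely illustrative Python sketch; the endpoint, key and response shape are hypothetical placeholders, not Bard's or ChatGPT's actual API.

    import requests

    # Hypothetical endpoint and key, for illustration only.
    API_URL = "https://example.com/v1/chat"
    API_KEY = "your-key-here"

    # Chat-format models take a list of role-tagged messages, not one free-form prompt.
    messages = [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain the James Webb Space Telescope to a 9-year-old."},
    ]

    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"messages": messages},
        timeout=30,
    )
    resp.raise_for_status()
    reply = resp.json()["reply"]  # response field name is assumed for this sketch
    messages.append({"role": "assistant", "content": reply})  # carry the conversation forward
    print(reply)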
Google CEO Sundar Pichai: "Bard seeks to combine the breadth of the world's knowledge with the power, intelligence and creativity of our large language models. It draws on information from the web to provide fresh, high-quality responses. Bard can be an outlet for creativity, and a launchpad for curiosity, helping you to explain new discoveries from NASA's James Webb Space Telescope to a 9-year-old, or learn more about the best strikers in football right now, and then get drills to build your skills."
Source: Google

52 Comments on Google Prepares ChatGPT Alternative Called Bard AI

#26
Bwaze
Frick: Algorithms isn't the same thing.
"Since 2015, Google has been utilizing RankBrain, a machine learning algorithm. It is how it facilitates processing search results and delivering more relevant answers to users. Google uses AI every time a user enters a search query, and the technology is constantly learning and improving."
Posted on Reply
#27
AusWolf
Frick: Algorithms isn't the same thing.
What is AI if not a self-improving algorithm?
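In its simplest form, that "self-improving algorithm" is just a program that adjusts its own parameters from feedback instead of following fixed rules. A minimal, purely illustrative Python sketch (single-parameter gradient descent on made-up data):

    # A model that improves its own parameter from feedback rather than fixed rules.
    data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]  # (x, y) pairs, roughly y = 2x

    w = 0.0    # the single parameter the algorithm "learns"
    lr = 0.05  # learning rate

    for epoch in range(200):
        for x, y in data:
            error = w * x - y    # how wrong the current guess is
            w -= lr * error * x  # nudge the parameter to reduce that error

    print(f"learned w = {w:.2f}")  # ends up close to 2.0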
Posted on Reply
#28
dgianstefani
TPU Proofreader
Open source AI needed.

Closed source AI reflects the politics and biases of its creators.

This will become a worse problem than Google monopolising most of the internet, mark my words.

"AI"' has always been a dressed up search engine with some fluff to make them seem smart.

I want to see open source AI that can be downloaded to a personal device. Our smartphones have good machine learning and AI hardware at this point.
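On the "downloadable to a personal device" point: smaller open models can already run locally with off-the-shelf tooling. A rough Python sketch using the Hugging Face transformers library (the tiny gpt2 model here is just a stand-in; quality and hardware requirements scale with model size):

    # pip install transformers torch
    from transformers import pipeline

    # gpt2 is a small, fully open model; swap in any open chat-tuned model you trust.
    generator = pipeline("text-generation", model="gpt2")

    out = generator("Open source AI on a personal device means", max_new_tokens=40)
    print(out[0]["generated_text"])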

I do not trust corporate or government information disseminating software, which is what all of these are.

ChatGPT gives extremely political answers, and despite being made by "OpenAI", it is entirely closed source.
Posted on Reply
#29
AusWolf
dgianstefani: Open source AI needed.

Closed source AI reflects the politics and biases of its creators.

This will become a worse problem than Google monopolising most of the internet, mark my words.

"AI" has always been a dressed-up search engine with some fluff to make it seem smart.

I want to see open source AI that can be downloaded to a personal device. Our smartphones have good machine learning and AI hardware at this point.

I do not trust corporate or government information disseminating software, which is what all of these are.

ChatGPT gives extremely political answers, and despite being made by "OpenAI", it is entirely closed source.
As long as humankind doesn't start worshipping AI technology as a god with answers to everything, and instead looks at it for what it is (a gimmick, nothing more), we should be fine. I guess. :(

I mean, political bias existed long before AI technology, and it permeates everyday life as we speak. I don't think AI can make it much worse.
Posted on Reply
#30
dgianstefani
TPU Proofreader
AusWolf: As long as humankind doesn't start worshipping AI technology as a god with answers to everything, and instead looks at it for what it is (a gimmick, nothing more), we should be fine. I guess. :(

I mean, political bias existed long before AI technology, and it permeates everyday life as we speak. I don't think AI can make it much worse.
You're wrong on both counts.

AI is by no means a gimmick, and it can make things significantly worse.

AI has the potential to offload all political programming to automated systems, removing human involvement.

We are getting to the point where all information is somewhat tainted, and it will become very hard to question narratives when everything you see, hear and read is a conforming part of information warfare.

I.e., a "how will you ask the right questions if you're not even aware of the topic?" kind of situation.

People are already sunk into their echo chambers through social media and site choices; AI is going to make this much worse.

Younger generations are plugged into this information machine as soon as they have a device - critical thinking and information filtering are being actively degraded.
Posted on Reply
#31
AusWolf
dgianstefani: You're wrong on both counts.

AI is by no means a gimmick, and it can make things significantly worse.

AI has the potential to offload all political programming to automated systems, removing human involvement.

We are getting to the point where all information is somewhat tainted, and it will become very hard to question narratives when everything you see, hear and read is a conforming part of information warfare.

I.e., a "how will you ask the right questions if you're not even aware of the topic?" kind of situation.

People are already sunk into their echo chambers through social media and site choices; AI is going to make this much worse.

Younger generations are plugged into this information machine as soon as they have a device - critical thinking and information filtering are being actively degraded.
I see what you mean, but I don't think human involvement has made politics a lot better in the past. We're basically at the same stage as we were 100 years ago, just with AI this time.

Edit: It's scary and bad, I know. All I'm saying is, we've been through similar shit before.
Posted on Reply
#32
Bwaze
We're gonna be OK.


Posted on Reply
#33
DeathtoGnomes
This far into the thread and no one mentions Skynet. Is the Skynet reference still relevant? It should be; someone will direct ChatGPT to learn to be self-aware, just to fight with Google's AI attempt.

I haven't been able to get any time with ChatGPT; it's always too busy.
Posted on Reply
#34
Frick
Fishfaced Nincompoop
Bwaze"Since 2015, Google has been utilizing RankBrain, a machine learning algorithm. It is how it facilitates processing search results and delivering more relevant answers to users. Google uses AI every time a user enters a search query, and the technology is constantly learning and improving."
Fair enough. I'd still say AI could make search good again, if used correctly. It won't happen, but still.
Posted on Reply
#35
Vayra86
AusWolf: I see what you mean, but I don't think human involvement has made politics a lot better in the past. We're basically at the same stage as we were 100 years ago, just with AI this time.

Edit: It's scary and bad, I know. All I'm saying is, we've been through similar shit before.
This is new, because now everything is connected. That makes it inescapable, whereas if I chose to make my own reality in some distant village 100 years ago, nobody needed to know.

Look at nuclear bombs. Some genies won't go back in their bottles, and then we're in a new meta ;)
Posted on Reply
#36
AusWolf
Vayra86: This is new, because now everything is connected. That makes it inescapable, whereas if I chose to make my own reality in some distant village 100 years ago, nobody needed to know.
As long as German/Russian/American armies didn't come marching through, or a bomb didn't destroy your home a couple of decades later. By the same analogy, nothing prevents you from living in a distant village nowadays, either. :) Just because the world wants you to connect, you don't have to. Even with a normal job and a normal life, you still can (and have to) filter what you consume and what you spend time digesting.
Posted on Reply
#37
StefanM
Vayra86: Ah... Bards... Those folk of ye olde times that travelled the land in search of tales to sing about.

As we all know, every time information passes from one to another, crucial details and correctness are lost.

They chose a great name.
I would have preferred a first name like Cacofonix
Posted on Reply
#38
Bwaze
DeathtoGnomes: This far into the thread and no one mentions Skynet. Is the Skynet reference still relevant? It should be; someone will direct ChatGPT to learn to be self-aware, just to fight with Google's AI attempt.

I haven't been able to get any time with ChatGPT; it's always too busy.
Referencing Skynet supposes the dangers of AI are that it will become self-aware, sentient, and hostile to humans.

We now fear AI might do a lot of damage long before it ever achieves this - purely as a "tool" in the hands of humans, replacing human work, creativity...
Posted on Reply
#39
AusWolf
Bwaze: Referencing Skynet supposes the dangers of AI are that it will become self-aware, sentient, and hostile to humans.

We now fear AI might do a lot of damage long before it ever achieves this - purely as a "tool" in the hands of humans, replacing human work, creativity...
Exactly. Besides, let's not forget that as of now, AI is nothing more than a fancy term for a self-improving algorithm. The AI we see in movies is a vastly different concept. A machine learning to do a required task on its own, without much prior programming, is not the same as doing tasks that aren't required, or being sentient.

Speaking of movies, has anybody seen Her?
Posted on Reply
#40
kiriakost
BBC UK shared a different story.
Microsoft owns ChatGPT, and Google is peeing its pants about it.
Microsoft's search engine has only 3% of searches compared to Google.
Microsoft is now becoming mighty, and Google is unable to fight back.
Posted on Reply
#41
Vayra86
AusWolf: As long as German/Russian/American armies didn't come marching through, or a bomb didn't destroy your home a couple of decades later. By the same analogy, nothing prevents you from living in a distant village nowadays, either. :) Just because the world wants you to connect, you don't have to. Even with a normal job and a normal life, you still can (and have to) filter what you consume and what you spend time digesting.
Can we still filter? That's the whole point of what @dgianstefani said earlier; there's so much tainted information, you can't even tell what to filter out anymore.

At some point you're just unable to see the truth. It will only come to you after experiencing where you went wrong. That's fine for kids, it's even fine for adults... unless it's about, say, something like sexual harassment, or, say, which side of the road we're supposed to drive on.

And about analogies... those bombs do fall; the world does connect; and we are progressively adapting to each new reality. But we're also seeing how some things are becoming paradoxical right now, how goals we set and achieve turn into something that bites us in the ass. The economy and its unbridled growth is now squarely in that category, for example. A lot of recent technological development falls into that category too, and the internet is really the catalyst for it. Many things can exist just fine until everyone catches on to them - and that is the ultimate purpose of a fully connected world: transparency and access to information, each strengthening the other. Without checks and balances, that train is unstoppable, much like how algorithms excel at playing on our weaknesses to generate money/clicks.

An analogy for that: World of Warcraft. First of all, this game is also built on systems that influence the psyche. To keep you coming back, it implements all sorts of neat tricks. Now, initially, you were a unique sight if you wore full raid gear; people would stand around you in capital cities and ooh and aah about it. One expansion later, everyone could raid more easily and more tiers of gear became available, so people could feel several degrees of special, and more people could feel special. Then, another couple of expansions later, everyone could raid by simple matchmaking, difficulties were introduced to cater to casual random grouping, and gear was adjusted accordingly. Now, everyone could feel special. Except now, suddenly, nobody was special anymore, and everyone got to play the same game. Too bad that in this process, everything that made the experience unique and defined was now gone. Everything feels samey, and only new progression rewards could keep people tied to the game, instead of nice communities that together 'figured out a challenge' even when, at the end, half the raid didn't have a piece of gear to show for it; guilds quickly turned into come-and-go collections of people who logged in whenever they felt like it. After all, for most things, you could also use 'randoms'. The community became interchangeable. WoW really evolved with its time: it is now on-demand, flexible, accessible to everyone and fully inclusive; and it's also a dime-a-dozen game - interchangeable, almost entirely; the only thing that makes it worthwhile is in fact its history, which it actively uses to sell expansions now.

Can we go back? The gaming market is no longer producing MMOs with WoW's unique qualities, even though the concept generated enough of a market for many other companies to make copycats and thrive for quite a while. So we do learn over time, we evolve, and we won't be going back. It's a simple fact, bar those exceptional individuals who choose to go against everything - and therefore miss out on everything. Yes, you can always go back to solitary confinement... but that was never the idea of progress, was it?
Posted on Reply
#42
Assimilator
Xex360: We need some open source solutions, but one issue I can see is how to train the models given all the censorship in the world, and then English dominating the internet creates another form of censorship/bias.
Having too many languages, preventing communication, is a far greater problem than only having one.
Posted on Reply
#43
TumbleGeorge
To some previous comments:
Many languages are a fact of life on this planet. Maybe you want to be a dictator and change that. From a purely technical point of view, I don't see any problem: computer performance today is wasted on plenty of harmful and useless activities that take far more capacity than translation does. Furthermore, AI does not have the purely human problem of being too lazy to learn, nor does it need to rest.
Posted on Reply
#44
AusWolf
Vayra86: Can we still filter? That's the whole point of what @dgianstefani said earlier; there's so much tainted information, you can't even tell what to filter out anymore.
1. This hasn't been any different through other ages. Information and misinformation have always been a mixed bag of goods that you had to choose from for yourself. The only thing that's changed is the quantity we're bombarded with, thanks first to radio and television, and now the internet. But I don't think the newspapers of the 1800s, or the herald announcements of the Middle Ages, were any more truthful than the news is today. This is why I'm saying that one has to filter: if you accept everything you're bombarded with on a daily basis as truth, you'll go crazy in no time.

2. If you can't filter by what's true and what isn't (which is totally understandable nowadays), there have to be other criteria that are easier to filter by. As cynical as it sounds, my no. 1 criterion is "what's in it for me?". They may announce news about the economy on the radio while I'm driving to work, but the question is: do I care? Or am I only interested in how much the loaf of bread I'm about to buy costs? This is another reason why one has to filter: if you don't question whether a piece of information is useful to you or not, you'll dwell on every meaningless matter and want to solve every global issue you can't actually change, and as above: you'll go crazy.
Vayra86: And about analogies... those bombs do fall; the world does connect; and we are progressively adapting to each new reality. But we're also seeing how some things are becoming paradoxical right now, how goals we set and achieve turn into something that bites us in the ass. The economy and its unbridled growth is now squarely in that category, for example. A lot of recent technological development falls into that category too, and the internet is really the catalyst for it. Many things can exist just fine until everyone catches on to them - and that is the ultimate purpose of a fully connected world: transparency and access to information, each strengthening the other. Without checks and balances, that train is unstoppable, much like how algorithms excel at playing on our weaknesses to generate money/clicks.
The goals we set bite us in the ass because the world is being directed and changed by people who can't change their shoes on their own. We're led by idiots, and when they fail, we're the ones being blamed for not caring about global issues enough. Like climate change... like it's my fault for not having an electric car. Sure. But then whose fault was it to create a world where every member of the family has to work 5 days a week to make ends meet? I'll gladly stay home and not work, or work less, and not pollute the air with my dirty petrol car if the economy allows it. Does it, though? No, it doesn't. I work because I have to. I drive to work because I don't have a choice. I think this is sacrifice enough for the greater good, so private jet owning politicians and corporate leaders can leave my car alone, thank you very much. And as long as these jet owning imbecile politicians and CEOs control the flow of information, one had better have filters in place.
Vayra86: An analogy for that: World of Warcraft. First of all, this game is also built on systems that influence the psyche. To keep you coming back, it implements all sorts of neat tricks. Now, initially, you were a unique sight if you wore full raid gear; people would stand around you in capital cities and ooh and aah about it. One expansion later, everyone could raid more easily and more tiers of gear became available, so people could feel several degrees of special, and more people could feel special. Then, another couple of expansions later, everyone could raid by simple matchmaking, difficulties were introduced to cater to casual random grouping, and gear was adjusted accordingly. Now, everyone could feel special. Except now, suddenly, nobody was special anymore, and everyone got to play the same game. Too bad that in this process, everything that made the experience unique and defined was now gone. Everything feels samey, and only new progression rewards could keep people tied to the game, instead of nice communities that together 'figured out a challenge' even when, at the end, half the raid didn't have a piece of gear to show for it; guilds quickly turned into come-and-go collections of people who logged in whenever they felt like it. After all, for most things, you could also use 'randoms'. The community became interchangeable. WoW really evolved with its time: it is now on-demand, flexible, accessible to everyone and fully inclusive; and it's also a dime-a-dozen game - interchangeable, almost entirely; the only thing that makes it worthwhile is in fact its history, which it actively uses to sell expansions now.
I can't say much about WoW because I never played it. Though what you said sounds to me like a typical case of "every kid gets a medal". If you reward everyone equally regardless of their effort, then the reward will mean nothing.
Vayra86: Can we go back? The gaming market is no longer producing MMOs with WoW's unique qualities, even though the concept generated enough of a market for many other companies to make copycats and thrive for quite a while. So we do learn over time, we evolve, and we won't be going back. It's a simple fact, bar those exceptional individuals who choose to go against everything - and therefore miss out on everything. Yes, you can always go back to solitary confinement... but that was never the idea of progress, was it?
Missing out isn't as bad as it sounds, imo. You don't have to consume anything and everything. It's your life, you have to make yourself happy. If you're happy playing Half-Life until the end of days, not knowing what else is out there, then who am I to judge? Like I said above, one has to filter. The criteria and strictness are up to you.

Edit: I, for example, stopped browsing Facebook years ago. I still use Messenger to stay connected to friends and family, but not Facebook itself. My friends still use it to post family pictures and other personal stuff, and sometimes they ask me: did you see this? Or: did you like my picture? Naturally, I have no idea what they're talking about, and when I care enough, I just search for their profiles to look for the specific picture they wanted me to see. I don't care about all the other random stupid shit that's floating out there and I don't feel like I'm missing out on anything. :)
Posted on Reply
#46
LifeOnMars
Hey Google, explain -
Posted on Reply
#47
claes
Wow this thread :fear:
Posted on Reply
#48
dgianstefani
TPU Proofreader
*Sigh*

Possibilities for the world's greatest banter flushed down the toilet.

What could have been...

Posted on Reply
#49
DeathtoGnomes
Vayra86: or, say, which side of the road we're supposed to drive on.
The side everyone in the WHOLE world drives on: The Correct side. Of course.

There was some thread a while back about some guy getting fired from Google, I think, for saying some computer was sentient. This is that computer.
Posted on Reply
#50
Why_Me
kondamin: A chat bot trained to sell you stuff and omit information Google deems bad
This ^^ It's going to be Twitter 2.0, like before Musk purchased it.
Posted on Reply