Ask me anything. It's the long form of AMA, and one of the most famous forms of interactive conversation on Reddit. It's also a huge challenge, as Microsoft's Bing AI chatbot, aka "new Bing," is quickly learning.
At any given moment, some celebrity or notable figure is doing a Reddit AMA, usually posing with a photo to prove that it's really them sitting there answering the questions.
Opening the floor to ask anything is usually a minefield of inappropriate discourse, managed by a live community manager who fields and filters the questions. Otherwise, things quickly go off the rails. Even with that protection in place, they often do anyway.
When Microsoft launched its new ChatGPT-powered Bing AI Chat, it made clear that the bot was ready for any and all questions. That was either a sign of deep confidence in its relatively small but growing group of users, or incredibly shortsighted.
Even ChatGPT, which launched the original AI chatbot sensation and on which Bing's chat is based, doesn't offer that prompt. Instead, there's an empty text-entry box at the bottom of the screen. Above it are example questions, capabilities, and, most importantly, limitations.
Bing has that prominent prompt and, below it, an example question next to a big "Try it" button, alongside another button inviting you to "Learn more." To heck with that. We like to dive right in and, following Bing's instructions, ask it anything.
Naturally, Bing has been peppered with a wide range of questions, including many that have nothing to do with quotidian needs like travel, recipes, and business plans. And those are the ones we're all talking about, because, as always, asking "anything" means asking about anything. Google's Bard, by contrast, went with a potentially less risky prompt: "What's on your mind?"
Bing has been musing on love, sex, death, marriage, divorce, violence, enemies, defamation, and emotions it insists it does not have.
On OpenAI's ChatGPT, the home screen warns that it:
- Can occasionally produce incorrect information
- Can occasionally produce harmful instructions or biased content
- Has limited knowledge of the world and events after 2021
Too many questions
Bing Chat is slightly different from OpenAI's GPT, and it may not face all of those limits. In particular, its knowledge of world events may, thanks to the integration of Bing's knowledge graph, extend right up to the present day.
But with it out in the wild, or increasingly going wild, it may have been a mistake to encourage people to ask it anything.
What if Microsoft had built Bing AI Chat with a different prompt:
- Ask me some things
- Ask me a question
- What do you want to know?
With any of those slightly modified prompts, Microsoft could append a long list of caveats about how Bing AI Chat doesn't know what it's saying. Okay, it does (sometimes), but not in the way you know things. It has no emotional intelligence or reactions, or even a moral compass. I mean, it tries to act as if it has one, but recent conversations with The New York Times and even Tom's Hardware prove that its grip on the basic morality of good people is shaky at best.
In my own conversations with Bing AI Chat, it has repeatedly told me that it does not have human emotions, yet it still converses as if it does.
For anyone who has been covering AI for any amount of time, none of what has transpired is surprising. An AI knows:

- What it has been trained on
- What it can learn from new information
- What it can glean from the vast store of online data
- What it can learn from real-time interactions
Bing AI Chat, though, is no more conscious than any AI that came before it. It may be one of AI's better actors, in that its capacity for conversation exceeds anything I've experienced before. And that sensation only grows with the length of a conversation.
I'm not saying that Bing AI Chat becomes more believable as a sentient human, but it does become more believable as a somewhat irrational or confused one. Long conversations with real people can go the same way. You start on a topic, perhaps even argue about it, but at some point the discussion becomes less logical and rational. With people, emotion comes into play. With Bing AI Chat, it's like reaching the end of a rope where the fibers are still there but frayed. Bing AI has the information for some of these long conversations, but not the experience to weave it together in a way that makes sense.
Bing is not your friend
Microsoft invited this by telling people to "ask me anything." That pain is now being felt by Microsoft, and certainly by the people who deliberately pose questions no ordinary search engine would ever answer.
Before the arrival of chatbots, would you ever have considered using Google to fix your love life, explain God, or serve as a substitute friend or lover? I doubt it.
Bing AI Chat will improve, but not before we've had many more uncomfortable conversations in which Bing regrets its responses and tries to make them disappear.
Asking an AI anything is the obvious long-term goal, but we're not there yet. Microsoft took the leap, and now it's falling through a forest of questionable responses. It won't land until Bing AI Chat gets a lot smarter and more circumspect, or until Microsoft pulls the plug on the AI for a bit.
Still waiting to ask Bing anything? We have the latest details on the waitlist.