
  • Chin
  • Topic Author
  • Visitor
13 Feb 2016 14:41 #291796 by Chin

Artificial intelligence will become strong enough to be a concern, says Bill Gates
Former Microsoft boss joins Elon Musk and Stephen Hawking in suggesting that the march of AI could be an existential threat to humans
Bill Gates is among the worriers about the future potential of artificial intelligence.

Bill Gates is the latest prominent figure from the technology industry to express concern about the future evolution of artificial intelligence, although he thinks it will be “decades” before super-intelligent machines pose a threat to humans.

He joins Elon Musk and Stephen Hawking in suggesting that the march of AI could be an existential threat to humans. The former Microsoft boss gave his opinion during his latest Ask Me Anything (AMA) interview on the Reddit networking site.

“I am in the camp that is concerned about super intelligence. First the machines will do a lot of jobs for us and not be super-intelligent. That should be positive if we manage it well,” wrote Gates.

“A few decades after that though the intelligence is strong enough to be a concern. I agree with Elon Musk and some others on this and don’t understand why some people are not concerned.”

Musk spoke out in October 2014 during an interview at the AeroAstro Centennial Symposium, telling students that the technology industry should be thinking hard about how it approaches AI advances in the future.

“I think we should be very careful about artificial intelligence. If I had to guess at what our biggest existential threat is, it’s probably that. So we need to be very careful,” said Musk. “I’m increasingly inclined to think that there should be some regulatory oversight, maybe at the national and international level, just to make sure that we don’t do something very foolish.”

‘AI doomsday scenarios belong more in the realm of science fiction’
In a December interview, Professor Hawking went further. “The primitive forms of artificial intelligence we already have, have proved very useful. But I think the development of full artificial intelligence could spell the end of the human race.”

Future risks of artificial intelligence are being discussed widely and publicly within the technology industry, and even researchers who think warnings about machines extinguishing the human race are nonsense are alive to the need to keep exploring the risks.

“AI doomsday scenarios belong more in the realm of science fiction than science fact. However, we still have a great deal of work to do to address the concerns and risks afoot with our growing reliance on AI systems,” admitted an article co-written earlier in January by Eric Horvitz, director of the Microsoft Research lab, and Tom Dietterich, president of the Association for the Advancement of Artificial Intelligence.

That post outlined three key risks around artificial intelligence: programming errors in AI software; cyber-attacks on AI systems by criminals, terrorists and government-backed hackers; and so-called Sorcerer’s Apprentice scenarios, when AI systems respond to human instructions in unexpected (and possibly dangerous) ways.

“Each of the three important risks outlined above... is being addressed by current research, but greater efforts are needed,” wrote Horvitz and Dietterich, calling for more collaboration and funding to explore the challenges. “We must not put AI algorithms in control of potentially-dangerous systems until we can provide a high degree of assurance that they will behave safely and properly.”

‘Technology is not making people less intelligent’
During his AMA interview, Gates also talked about his work on a “personal agent” technology within Microsoft that will “remember everything and help you go back and find things and help you pick what things to pay attention to... it will work across all your devices”.

He also described the bitcoin cryptocurrency as “exciting” but said it wasn’t currently viable for use in the developing world. “For our [Bill and Melinda Gates] Foundation work we are doing digital currency to help the poor get banking services.

“We don’t use bitcoin specifically for two reasons,” he wrote. “One is that the poor shouldn’t have a currency whose value goes up and down a lot compared to their local currency. Second is that if a mistake is made in who you pay then you need to be able to reverse it so anonymity wouldn’t work.”

Gates was also asked whether technology “has made the masses less intelligent”. He replied: “Technology is not making people less intelligent. Technology is letting people get their questions answered better so they stay more curious. It makes it easier to know a lot of topics which turns out to be pretty important to contribute to solving complex problems.”

Please Log in or Create an account to join the conversation.

  • rz3300
  • Visitor
13 Feb 2016 16:16 #291810 by rz3300
Sounds like a really good idea for a science fiction novel. I guess if there is one man I would really be willing to sit down with to get his opinion on matters like this, it would be Bill Gates, but I think he might be stretching a little bit here. It is a fascinating subject, though, and I hope I get to see something in my lifetime.

Please Log in or Create an account to join the conversation.

  • jade
  • Topic Author
  • Visitor
05 Mar 2016 11:15 #295036 by jade
Oh well, and all the basic jobs are disappearing fast. Technology can accomplish so much nowadays, and many of us are left unemployed. :(
Of course there will always be some jobs that won't be touched by tech, like artists, psychologists, lawyers etc., but those professions are not for everyone. So what are the other people gonna do for a living?

Please Log in or Create an account to join the conversation.

  • briannagodess
  • Topic Author
  • Visitor
05 Mar 2016 15:14 #295095 by briannagodess
I think AI is definitely an area of concern. As time goes by, we become more and more reliant on technology and machines. We even become victims of our own inventions. I mean, look around you: people are so engrossed in cellphones and gadgets. There's not much interaction anymore.

And I think we need to be able to control the machines, instead of them controlling us. That way, we can work efficiently as humans and not rely on machines to do the work for us. It makes us lazy, dependent and more at risk of losing jobs that are supposed to be ours.

Please Log in or Create an account to join the conversation.

  • Observer
  • Visitor
06 Mar 2016 15:54 #295248 by Observer
It is already a cause for concern because its existence is eliminating a large number of jobs, which is contributing to the unemployment problem. There may be a greater cause for concern, however: as the technology gradually becomes more advanced, these machines may eventually develop a mind of their own and then become a serious threat to humans.

Please Log in or Create an account to join the conversation.
