I’m a high-tech thriller author, and in my next thriller I’m toying with the idea of adding a scene where the villain "tortures" an artificial intelligence for kicks.
I don’t believe I’ve ever seen that in any novels or screenplays, so I wanted to tap into the wisdom of the Reddit crowd to ask:
Is that plausible? How would one torture an artificial intelligence? Or treat it cruelly or inflict simulated pain?
Happy to do additional reading if steered to an on-point post or article. Thanks for any insights or theories, and let’s get the Donald Trump jokes out of the way up front (e.g., just feed it a constant video loop of Trump rallies).
Artificial intelligence is set to create more than 7m new UK jobs in healthcare, science and education by 2037, more than making up for the jobs lost in manufacturing and other sectors through automation, according to a report.
A report from PricewaterhouseCoopers argued that AI would create slightly more jobs (7.2m) than it displaced (7m) by boosting economic growth. The firm estimated about 20% of jobs would be automated over the next 20 years and no sector would be unaffected.
AI and related technologies such as robotics, drones and driverless vehicles would replace human workers in some areas, but would also create many additional jobs as productivity and real incomes rise and new and better products are developed, PwC said.
Professional, scientific and technical services, including law, accounting, architecture and advertising firms, are forecast to get the second-biggest boost, gaining nearly half a million jobs, while education is set to get almost 200,000 extra jobs.
John Hawksworth, the chief economist at PwC, said: “Healthcare is likely to see rising employment as it will be increasingly in demand as society becomes richer and the UK population ages. While some jobs may be displaced, many more are likely to be created as real incomes rise and patients still want the ‘human touch’ from doctors, nurses and other health and social care workers.
“On the other hand, as driverless vehicles roll out across the economy and factories and warehouses become increasingly automated, the manufacturing and transportation and storage sectors could see a reduction in employment levels.”
PwC estimated the manufacturing sector could lose a quarter of current jobs through automation by 2037, a total of nearly 700,000.
Transport and storage are estimated to lose 22% of jobs – nearly 400,000 – followed by public administration and defence, with a loss of almost 275,000 jobs, an 18% reduction. Clerical tasks in the public sector are likely to be replaced by algorithms while in the defence industry humans will increasingly be replaced by drones and other technologies.
London – home to more than a quarter of the UK’s professional, scientific and technical activities – will benefit the most from AI, with a 2.3% boost, or 138,000 extra jobs, the report said. The east Midlands is expected to see the biggest net reduction in jobs: 27,000, a 1.1% drop.
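A quick sanity check of the report’s headline figures, as quoted in the article above (a sketch only; the implied sector baselines are back-calculated from the article’s numbers, not taken from the PwC report itself):

```python
# Sanity-check the PwC figures quoted in the article.
# All inputs come from the text; the baselines are implied, not reported.

created = 7_200_000    # jobs AI is forecast to create by 2037
displaced = 7_000_000  # jobs forecast to be displaced by 2037
net = created - displaced
print(f"Net UK jobs gained by 2037: {net:,}")

# Manufacturing: ~700,000 jobs lost = a quarter of current jobs,
# implying a current sector size of roughly 2.8m.
manufacturing_current = 700_000 / 0.25
print(f"Implied manufacturing employment today: {manufacturing_current:,.0f}")

# London: a 2.3% boost equals 138,000 extra jobs,
# implying a baseline of about 6m jobs.
london_base = 138_000 / 0.023
print(f"Implied London employment baseline: {london_base:,.0f}")
```

The net figure (200,000) is small relative to the gross flows in each direction, which is why the report frames AI as roughly employment-neutral overall while individual sectors see large swings.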
I’m aware this is a change from the usual content of this subreddit, but I don’t know a better place to post this (feel free to tell me if you do).
Anyway, I’m a student in Europe who would like to do a master’s degree related to AI/machine learning (ML) in Europe. I would like to go to the US/UK, where the leading universities are, but they are generally far too expensive for me. Most other universities in Europe (excluding the UK) would likely be more affordable for me. Unfortunately, I haven’t been able to find this information online. (The rankings that do exist are also usually focused on research rather than education.) I was hoping the people in this subreddit would be able to help me.
The right information here could have a very big impact on my future career, so any help is greatly appreciated!
Cornell University researchers have developed a prototype of a robot that can express “emotions” through changes in its outer surface. The robot’s skin covers a grid of texture units whose shapes change based on the robot’s feelings.
Assistant professor of mechanical and aerospace engineering Guy Hoffman, who has given a TEDx talk on “Robots with ‘soul’”, said the inspiration for designing a robot that gives off nonverbal cues through its outer skin comes from the animal world, based on the idea that robots shouldn’t be thought of in human terms.
“I’ve always felt that robots shouldn’t just be modeled after humans or be copies of humans,” he said. “We have a lot of interesting relationships with other species. Robots could be thought of as one of those ‘other species,’ not trying to copy what we do but interacting with us with their own language, tapping into our own instincts.”
Their work is detailed in a paper, “Soft Skin Texture Modulation for Social Robots,” presented at the International Conference on Soft Robotics in Livorno, Italy. Doctoral student Yuhan Hu was lead author; the paper was featured in IEEE Spectrum, a publication of the Institute of Electrical and Electronics Engineers.
Hoffman and Hu’s design features an array of two shapes, goosebumps and spikes, which map to different emotional states. The actuation units for both shapes are integrated into texture modules, with fluidic chambers connecting bumps of the same kind.
The team tried two different actuation control systems, with minimizing size and noise level a driving factor in both designs. “One of the challenges,” Hoffman said, “is that a lot of shape-changing technologies are quite loud, due to the pumps involved, and these make them also quite bulky.”
Hoffman does not have a specific application for his robot with texture-changing skin mapped to its emotional state. At this point, just proving that this can be done is a sizable first step. “It’s really just giving us another way to think about how robots could be designed,” he said.
Future challenges include scaling the technology to fit into a self-contained robot — whatever shape that robot takes — and making the technology more responsive to the robot’s immediate emotional changes.
“At the moment, most social robots express [their] internal state only by using facial expressions and gestures,” the paper concludes. “We believe that the integration of a texture-changing skin, combining both haptic [feel] and visual modalities, can thus significantly enhance the expressive spectrum of robots for social interaction.”