Oracle’s Chairman is very, very excited to invent the Torment Nexus; or, how AI-powered mass surveillance is totally going to be a force for good and not fascism.

Artificial intelligence (AI) is driving the next (much scarier) evolution of mass surveillance. The mass deployment of AI to monitor average citizens (and, supposedly, police body cam footage) is coming. And Oracle is going to power it, according to the cloud company's cofounder and chairman, Larry Ellison, speaking at an Oracle financial analyst meeting.

AI — keeping all of us on our “best behaviour” 

While Elon Musk's increasingly public courting of right-wing extremists, misogynist grifters, prominent transphobes, and outright Nazis is perhaps the loudest example of the ways in which big tech will full-throatedly throw in its lot with fascism rather than watch stock prices dip in any way, he has some stiff competition. 

Larry Ellison, in what was the most expansive and clearly unscripted section of Oracle’s hour-long public Q&A session last week, talked at some length about his vision for AI as a tool of mass surveillance. And, of course, he also suggested that, if one were to build an AI-powered surveillance state, Oracle (a company with a significant track record as a contractor for the US government) was the strategic partner best-suited to help realise that vision. 

Who watches the watchmen (when they shoot an unarmed black teenager)? 

Ellison's first example of how he'd deploy this technology, however, was police body cams. Designed to record officer interactions with members of the public, body cams supposedly increase accountability, transparency, and trust at a time when public opinion of law enforcement has rarely been lower.  

Since body cams first started making their way into police forces in the US and UK, results have been mixed. On one hand, police in the UK objectively lie less when on camera. Researchers at Queen Mary University of London found that not only were police reports from recorded interactions significantly more accurate, but cameras also significantly reduced the rate of negative interactions. 

However, another “shocking” report on policing in the UK by the BBC found that police were routinely switching off their body-worn cameras when using force, as well as deleting footage and sharing videos on WhatsApp. The BBC’s investigation from September 2023 found more than 150 reports of camera misuse by forces in England and Wales.

The situation isn't much different in the US, where Eric Umansky and Umar Farooq of ProPublica noted in a (very good) article last December that, despite "hundreds of millions in taxpayer dollars" being spent on a supposed "revolution in transparency and accountability," the result is a situation where "police departments routinely refuse to release footage — even when officers kill." And officers kill a lot in the US. Last year, American police used lethal force against 1,163 people, up 66 from 2022, continuing an upward trend that dates back to 2017. 

Policing the police with AI

Ellison’s argument that he wants to use AI to make police more accountable is, on the face of it, a potentially positive one.  

Lauding the potential of Oracle Cloud Infrastructure combined with advanced AI, Ellison painted a picture of a more “accountable” world.  He described AI as a constant overseer that would ensure “police will be on their best behaviour because we’re constantly watching and recording everything that’s going on.” 

His plan is for the police to use always-on body cams. These cameras will even keep recording when officers visit the restroom or eat a meal — although accessing sensitive footage requires a subpoena. Ellison’s plan is then to use AI trained to monitor officer feeds for anything untoward. This could, he theorised, prevent abuse of police power and save lives. “Every police officer is going to be supervised at all times,” he said. “If there’s a problem AI will report that problem to the appropriate person.” 

So far, so totally not something that police officers could get around with the same tactics (duct tape and tampering) they already use to disable body cams. 

However, police officers aren’t the only ones Ellison envisions under the watchful eye of artificial intelligence, observing us constantly like some sort of… Large sibling? Huge male relative? There has got to be a better phrase for that. Anyway—

Policing the rest of us with AI 

Ellison’s almost throwaway point at the end of the call is by far the most alarming part of his answer. “Citizens will be on their best behaviour because we’re constantly recording and reporting,” he said. “There are so many opportunities to exploit AI… The world is going to be a better place as we exploit these opportunities and take advantage of this great technology.” 

AI-powered, cloud-connected surveillance solutions are already big business, from hardware devices offering 24/7 protection to software-based business intelligence delivering new data-driven insights. The hyper-invasive "supervision" that Ellison describes (drools over might be more accurate) is far from the pipe dream of one tech oligarch. It's what they talk about openly, at dinner with each other (Ellison recently had a high-profile dinner with Elon Musk, another government surveillance contract profiteer), and in earnings calls; it's what they're going to sell to governments for billions of dollars to make their EBITDA go up at the expense of fundamental rights to privacy.

It's already happening. In 2022, a class action lawsuit accused Oracle's "worldwide surveillance machine" of amassing detailed dossiers on some five billion people. The suit accused the company and its adtech and advertising subsidiaries of violating the privacy of the majority of the people on Earth.
