
The rise of AI in decision-making

The explosion of artificial intelligence (AI) is filling our world with either idealistic dreams or doomsday predictions, yet neither captures the real threat.

We’re not facing a “big robot rebellion” where AIs or robots stage a hostile takeover. Forget the movie scenes in The Terminator or The Matrix—that’s not on the cards. The technology for AIs to conquer the world doesn’t exist.

AI may be excelling in specific tasks, like folding proteins or playing chess, but it’s not capable of building armies or running nations.

Yet, this lack of a robot uprising doesn’t mean we’re safe. AI is quietly infiltrating our lives through less dramatic, but equally powerful means: digital bureaucracy.

The true danger isn’t killer robots. It’s the encroachment of AI into decisions that control our day-to-day lives. Humans have evolved to fear tangible threats. Think big predators, like lions and sharks.

But we’re less prepared to spot threats from paperwork, documents, or bureaucracy. Bureaucracy itself is a relatively new invention, having developed only 5,000 years ago with the creation of writing.

Before writing, ownership, for example, depended on what the community agreed upon. If you “owned” a plot of land, it was because your neighbors respected it. No documents were required. 

From clay tablets to digital bureaucrats

That all changed when people began using clay tablets and records. Written documents meant that property rights were no longer about community consensus but official records. Ancient Mesopotamians turned mud chunks into official ownership symbols, flipping the idea of ownership.

A court decision could be based on a clay tablet saying you owned a piece of land, even if the community didn’t agree. Fast-forward to today and our systems are still built on similar principles, only now we use silicon chips and digital records instead of clay.

This shift transformed power structures. Ownership became something that could be sold and traded without a nod from the local community. Bureaucracies emerged, making tax collection, military funding, and central government possible.

Bureaucrats became essential players in these systems, using records, forms, and stamps to manage armies, allocate resources, and even control laws. These bureaucratic systems gave rise to centralized states with extensive control over their populations, a control now shifting to AI.

AI doesn’t need to raise a robot army; it just needs to master bureaucratic systems. Within these frameworks, AI can make more influential decisions than any human, as seen today. AIs are already deciding if we get loans, job offers, college admissions, and even medical diagnoses.

Imagine AI bankers determining credit eligibility, AI judges ruling on court cases, or military AIs calculating strike targets. AI doesn’t need a robot rebellion. The bureaucratic power it’s inheriting is already enormous.

AI in social media and public influence

Social media algorithms, though primitive, already wield massive power. AI algorithms, particularly those from Facebook, X (formerly Twitter), YouTube, and TikTok, are crafted with one goal in mind: user engagement. The longer people stay on these platforms, the more money flows to the corporations.

Through trial and error with billions of users, these algorithms figured out that triggering emotions like greed, anger, and fear increases engagement. When they press these emotional buttons, they keep users hooked.

These algorithms didn’t stop at optimizing for time spent online. They went further, discovering that content provoking intense emotions boosts user engagement. This discovery has resulted in the spread of conspiracy theories, misinformation, and social divisions.
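The trial-and-error process described above can be sketched as a simple multi-armed bandit. This is an illustrative toy, not any platform's actual system: the content categories and engagement rates below are invented assumptions, and real recommenders are vastly more complex. The point it demonstrates is that an optimizer rewarded only for engagement will concentrate on the most provocative content without ever being told to.

```python
import random

# Hypothetical sketch: an epsilon-greedy bandit "discovers" that emotionally
# charged content maximizes engagement. Category names and engagement
# probabilities are illustrative assumptions, not real platform data.

CATEGORIES = ["neutral_news", "how_to", "outrage", "fear"]

# Assumed probability that a user engages with each content category.
TRUE_ENGAGEMENT = {"neutral_news": 0.05, "how_to": 0.08,
                   "outrage": 0.30, "fear": 0.25}

def run_bandit(rounds=50_000, epsilon=0.1, seed=42):
    rng = random.Random(seed)
    counts = {c: 0 for c in CATEGORIES}   # times each category was shown
    rewards = {c: 0 for c in CATEGORIES}  # engagements observed

    for _ in range(rounds):
        if rng.random() < epsilon:
            choice = rng.choice(CATEGORIES)  # explore: try something random
        else:
            # exploit: show the category with the best observed engagement rate
            choice = max(CATEGORIES,
                         key=lambda c: rewards[c] / counts[c] if counts[c] else 0.0)
        counts[choice] += 1
        if rng.random() < TRUE_ENGAGEMENT[choice]:  # simulated user reaction
            rewards[choice] += 1
    return counts

counts = run_bandit()
# After enough trials, impressions concentrate on the highest-engagement
# category, with no notion of *why* that content engages.
print(max(counts, key=counts.get))
```

Note that the objective function never mentions outrage or fear; the algorithm simply learns which lever gets pulled most often, which is the dynamic the paragraph above describes.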

The world today is flooded with digital outrage, fear-mongering, and polarization, thanks to algorithms that prioritize clicks over cohesion. By exploiting emotional triggers, social media algorithms have fueled today’s epidemic of conspiracy theories and fake news.

These algorithmic “deciders” aren’t inherently malicious. On the contrary, they’re optimized to perform their roles efficiently. Still, their decisions lack the human intuition and context we usually expect in such important areas.

AI might make faster or more consistent judgments, but if something goes wrong, the results can be disastrous. This potential risk has already become visible in social media’s influence on society, serving as a warning for where AI’s unchecked power could lead.

Biden sets AI rules for Pentagon and intelligence agencies

President Joe Biden announced a national security memorandum with new rules for AI use in national security, preventing the Pentagon and intelligence agencies from using AI in ways that contradict democratic values.

This is the first directive guiding AI’s role in national security. The new rules will encourage AI experimentation while ensuring that government bodies don’t use AI to infringe on rights like free speech or undermine nuclear controls.

“Our memorandum directs the first-ever government-wide framework on our AI risk management commitments,” said National Security Adviser Jake Sullivan. He outlined goals to avoid bias, uphold accountability, and ensure human oversight of AI in sensitive roles.

Though not legally binding, the rules cover national security applications like cybersecurity, counter-intelligence, and logistics in military operations. Biden also issued export controls last year, slowing China’s AI advancements by restricting tech access.

Under the directive, the AI Safety Institute in Washington will be responsible for inspecting AI tools to prevent misuse before release, with the US intelligence community prioritizing monitoring AI advancements in other countries.


