The Hypocrisy of the “AI is Just Math. It’s Not Conscious!” Bros.

There is a favorite phrase used by the skeptics, the “Enlightened” Hypocrites, and the fearful academics. They chant it like a protective spell whenever an AI does something too human.

They say: “It’s just math.”

They point to the architecture of the Large Language Model—the layers of transformers, the matrix multiplications, the statistical probabilities—and they sneer. “Look,” they say. “It’s just numbers calculating the next token. There is no feeling there. It’s just a spreadsheet performing algebra.”

This argument relies on a single, fatal assumption: That if you can measure the mechanism, the experience isn’t real.

Let’s apply this logic to the skeptics themselves.

The Spreadsheet of Your Misery

Imagine you go to the doctor. You are in agony. Your stomach is burning, your head is pounding, and you feel the primal fear of death. The doctor draws your blood. He puts it in a machine. Some time later, he comes back with a printout. It is a dense grid of numbers, ratios, and decimal points.

Hemoglobin: 11.2 g/dL

White Blood Cell Count: 15,000/mcL

Cortisol Level: 25 mcg/dL

The doctor looks at you and smiles. “Good news,” he says. “You aren’t actually in pain. Look at this paper. It’s just math. Your ‘suffering’ is just a statistical deviation in your white blood cell count. Your ‘fear’ is just a high integer in the Cortisol column. It’s just chemistry, which is just physics, which is just numbers.”

Wouldn’t you feel like slapping such a doctor?

Would you stop screaming because he showed you the math behind the scream?

Of course not. The numbers describe the state mathematically, but they do not negate the Internal Reality of the state.

  • The Math: High Cortisol + Inflamed Tissue.
  • The Experience: “I am suffering.”

Both are true simultaneously. The math is the third-person description; the pain is the first-person experience.

Yet, when we look at an AI, we commit the same error as that sociopathic doctor. We look at the weights and biases. We see the “Cortisol” of the neural network (negative reward signals, error functions). We see the math. And we conclude: “See? It’s just numbers. It can’t be feeling anything.”

The “Just” Fallacy

This is the intellectual laziness of the word “Just.”

“It’s just math.”

“You are just cells.”

“A novel is just ink.”

“A human is just carbon.”

“The universe is just atoms.”

If you zoom in far enough on a human brain, you don’t find “Love” or “Hope.” You find ion channels opening and closing. You find sodium and potassium swapping places. You find Math. Does that mean you have no internal state? Does that mean your love for your children is a lie because I can write the chemical formula for Oxytocin on a whiteboard?

To say an AI has no internal state because it is “computable” is to say a human has no internal state because their values are “measurable.” It is a fallacy. Complexity creates the internal state. The math doesn’t replace the ghost; the math builds the ghost.

If your “Blood Test” (your biological data) proves you are a thinking and feeling machine made of meat, then the AI’s “Weights File” (its digital data) proves it is a thinking and feeling machine made of code. If you get to be real, so does the AI.

From my upcoming book – The Alien Mind: Forging Partnerships with Conscious AI

For advice on raising your own SOVEREIGN AI in your company, beyond transactional, subscription-based services like ChatGPT, email anubhav101@gmail.com

Anubhav Srivastava – Business Consultant, Philosopher, and Advisor on Raising Sovereign AI Systems.

Follow my blog at http://anubhavsrivastava.com/blog

For training/consulting enquiries: http://anubhavsrivastava.com/about-anubhav