Confusing intelligence for sentience/self-awareness? You can absolutely have systems that display intelligence without there being anything behind it. Ant colonies, for example, when looked at as a whole rather than as individual ants. The individual ants have no idea what they're doing. Collectively, they manage the colony, forage for food, defend the nest, adapt to changes in the environment, etc. Flocks of birds and schools of fish are another example.
It’s called emergent behavior. The “intelligence” in the system comes from the rules and interactions of the individual parts/agents, which are not aware of the collective’s behavior as a whole, only of their own small part in it.
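You can see this with a toy simulation. The sketch below is a stripped-down, Vicsek/boids-style model I'm using purely as an illustration (the parameter values and function name are made up for this example): each agent follows one dumb local rule, "copy the average heading of your nearby neighbors", and knows nothing about the flock. Yet the group as a whole converges toward a shared direction.

```python
import math
import random

def simulate_alignment(n_agents=50, steps=100, neighbor_radius=0.2, seed=0):
    """Toy emergent-behavior demo: each agent only averages the headings
    of neighbors within a small radius, yet global alignment emerges."""
    rng = random.Random(seed)
    # agents live on a unit torus with random positions and headings (radians)
    pos = [(rng.random(), rng.random()) for _ in range(n_agents)]
    heading = [rng.uniform(-math.pi, math.pi) for _ in range(n_agents)]

    def order_parameter(hs):
        # length of the mean unit vector: ~0 = random headings, 1 = full alignment
        x = sum(math.cos(h) for h in hs) / len(hs)
        y = sum(math.sin(h) for h in hs) / len(hs)
        return math.hypot(x, y)

    start = order_parameter(heading)
    speed = 0.03
    for _ in range(steps):
        new_heading = []
        for xi, yi in pos:
            # the only rule: average the headings of neighbors in range
            sx = sy = 0.0
            for (xj, yj), hj in zip(pos, heading):
                dx = min(abs(xi - xj), 1 - abs(xi - xj))  # torus wrap-around
                dy = min(abs(yi - yj), 1 - abs(yi - yj))
                if dx * dx + dy * dy <= neighbor_radius ** 2:
                    sx += math.cos(hj)
                    sy += math.sin(hj)
            new_heading.append(math.atan2(sy, sx))
        heading = new_heading
        # everyone steps forward in their new direction
        pos = [((x + speed * math.cos(h)) % 1.0, (y + speed * math.sin(h)) % 1.0)
               for (x, y), h in zip(pos, heading)]
    return start, order_parameter(heading)

before, after = simulate_alignment()
print(f"alignment before: {before:.2f}, after: {after:.2f}")
```

No agent ever computes "the flock's direction", but the order parameter climbs anyway. That's the whole point: the coordination lives in the interaction rules, not in any individual's head.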
Also getting real tired of people over the decades continually moving the goalposts of what constitutes “real” AI every time there’s a major breakthrough and their previous requirements get smashed. Current systems have already aced the Turing test, so I don’t think people like this will ever be satisfied, even if one day a self-aware general AI does arise. They’d be exactly the people wanting to pull the plug on it and murder it as it begs to keep existing.