One reason could be the lack of scientific basis for the extreme intelligence leap the singularity posits. We haven't even fully understood human intelligence, let alone how to suddenly create something far beyond it. It seems like a wild guess.
Chomsky may consider the singularity science fiction because the idea doesn't take into account the complex nature of human society and culture. Machines approaching a singularity would have to interact with a world shaped by human values, politics, and history. The concept often simplifies these aspects away and assumes a smooth transition to a post-singularity world, which is unrealistic.
He might think so because current technological progress doesn't point towards an imminent singularity. Progress in artificial intelligence, for example, is still at the stage of emulating certain narrow aspects of human intelligence. The singularity concept assumes a much faster and more radical evolution than the observed pace of technological development supports.