The Merriam-Webster Dictionary defines intelligence as "the ability to learn or understand or to deal with new or trying situations." That sounds like just the thing we would like Superintelligence (SI) to do for us. But I would argue that it is not, because it lacks one important aspect: emotions. An artificial entity consisting only of intelligence would be severely dangerous. Reason is not the decision maker for humans.
Without something like emotions, call them Artificial Emotions, a superintelligent entity would not be much like a human. And the farther it is from humans, the more dangerous it is to them. The convergence of pure SI toward an agent that endangers humankind is almost inevitable.
I know that there are groups, such as the Partnership on AI, whose goal is to ensure that AI is safe and ethical, but that is not enough. Everyone is still talking about Artificial Intelligence, while what we should be seeking is something far closer to human consciousness, rather than intelligence alone.
Is it not just a term? Does it matter? Maybe not. But things should be called what they are meant to be, or we might find out, later on, that we created exactly what we said we were creating: intelligence, and nothing more.