After 9/11, many people came to believe that Islam teaches hate. Several Arab hijackers brought down the Twin Towers, and many people across Middle Eastern countries gave thanks to Allah for what happened.

Contrary to this, the word Islam shares its Arabic root with salaam, the word for peace.

So, what is your opinion... does Islam teach peace or hate? Or does it depend on the particular branch of Islam?