I'm probably a year or two behind on these politics, and biased since I'm a second-gen Japanese-American, but from what I know, the Japanese government, along with the US, is trying to re-militarize despite the majority of the Japanese people wanting to stay peaceful and pacifist, possibly even entering a war in the Middle East due to recent events.
Personally, I believe that Japan should not be involved in this at all. They should keep their self-defense force, but nothing more. Going to war with other countries would only harm Japan's relations abroad and its culture.
What do you guys think? Do any of you think that Japan should re-militarize, and why?