This is dangerous on many different levels. The negative parts of American history are important to fully understanding America, and maybe even to truly appreciating the society as it stands today. Removing negative perspectives will simply further the indoctrination that people living in the US already receive.
Emphasizing negative opinions because they are things we have to remember, however, can create a generation that views the US as a horrible, corrupt nation and opposes any action it takes to intervene globally. We don't want a US that sits and waits while other nations do as they please; Russia, China, and others are not going to be content with their nations as they are. A balanced, neutral history is best: history books should cover every important topic without emphasizing either negative or positive traits. These Republicans may very well have a point. However, I think it's more likely they are scared of classes actually being neutral toward history than of them covering things up.
Even though it's a state issue, I wonder whether this mentality will ever have a significant impact on foreign politics. If we can't admit that the Cold War was overwhelmingly negative for pretty much every continent, or that we committed genocide against Native Americans in our rush to expand west, I can't imagine it will contribute positively to the world's opinion of the USA.