I've noticed that whenever someone starts talking about "the parents," people want to shut them down. The immediate reaction is a harsh backlash, usually accusing the author of parent bashing. "The parents" are off-limits. But I'm going to go there. :)
I believe society started a trend from which there is no escape and no turning back. Parents were allowed to relinquish the reins of raising their children, and they did; they handed them over to the schools, to the teachers. They stepped back and said, "You take care of them! You make sure my child eats, you make sure my child can read, you make sure my child gets enough exercise, you make sure my child doesn't bully anyone." The list is endless! How did this happen? They are parents, and that is a job that should be taken seriously.