I am not a fan of the idea. Moderating content requires not only incredible patience but also emotional intelligence. It's a lot easier to spot a blatant violation of the rules than to know how to react to it properly. Over the years we've had quite a number of members who were fantastic at scouting and reporting violations. At the same time, they often became emotionally involved: getting into confrontational direct conversations with the violator, overreacting to everything that person did, or jumping to conclusions far too early. It is also important to read between the lines, spot honest mistakes, and know when and how to give members a break. I'd prefer that staff retain control of these processes.
I could only accept a hybrid solution in which certain steps are automated. As Conor mentioned, if a post receives a certain number of reports, it could at least be temporarily suspended pending review by staff (though that threshold is easily gamed if a group of people sets out to intentionally bring another member down).