Don't flip it. You can never have certainty about the consequences of your actions -- perhaps things won't turn out as you hoped. You can, however, have certainty about what you intend your actions to accomplish. Since you only have control over YOURSELF, you cannot be held accountable for the deaths of the five people. But if you flip the switch, you are responsible for killing a human being. The moral choice is to leave the switch alone and do what you can to rescue the people in danger. Killing the few for the putative good of the many is not moral.
You don't know whether saving the five will in fact reduce suffering overall -- perhaps they are evil people, or their survival will cause harm in some other unforeseeable way. What you do know is that pulling the lever will kill someone. This is an extremely counterintuitive position, but I think it's the right one.
Think of it this way: what if there were one million people on the first track and 999,999 people on the second? Would you flip it then? I would argue you should go to jail if you did. Utilitarian quantification of good and bad is absurd.
Moral systems that hold you responsible for the material consequences of your actions are misguided. If a stone rolls off a hill and crushes a baby, that stone isn't a murderer. All we can know for sure is that willful murder is wrong, so flipping the switch is certainly wrong. Perhaps the five people on the track are just hallucinations. Perhaps the train will be derailed before it hits them. You can never know things like this, but you can know what your responsibility is, and your responsibility is to never willfully kill a human being.