You know what the easiest friggin' solution would be?!
So the Storm's touch screen has lots and lots of sensors under the surface to detect our touch, right? Well, right now, when the Storm senses a small contact against its screen (technically a capacitive screen like the Storm's senses conductive contact, not heat, but same idea), it sees it as our finger and acts accordingly, i.e., it highlights the button blue, etc.

But when we hold the phone between our cheek and shoulder, the Storm senses a large contact covering a big area of the screen. And right now, it seems like the Storm still treats that as a finger, and ends up pressing the mute or speakerphone button.

But what it SHOULD do when it senses a LARGE contact on the screen is assume that it's our face, or our leg (when it's in a pocket), and simply lock the screen until that large contact is gone. I.e., when we put the Storm to our face, the screen feels a large contact and locks; when we take it away from our face, the screen no longer feels it and unlocks.
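The logic above is dead simple. Here's a rough sketch of it in Python, just to show how little code it would take. Everything here is made up for illustration (the `Screen` class, the area threshold, the callback names), none of it is RIM's actual API:

```python
# Hypothetical palm/face-rejection sketch -- NOT a real BlackBerry API.
# Assumption: the touch hardware can report the size of the contact patch.

LARGE_CONTACT_AREA_MM2 = 400  # assumed threshold, far bigger than a fingertip


class Screen:
    def __init__(self):
        self.locked = False
        self.presses = []  # record of accepted finger presses

    def handle_touch(self, contact_area_mm2):
        """Called by the (hypothetical) driver whenever something touches the screen."""
        if contact_area_mm2 >= LARGE_CONTACT_AREA_MM2:
            # Way too big to be a finger: must be a cheek or a thigh.
            # Lock and ignore input while the big contact persists.
            self.locked = True
        elif not self.locked:
            # Normal-sized contact and screen is unlocked: treat as a press.
            self.presses.append(contact_area_mm2)

    def handle_release(self):
        """Called when all contact leaves the screen: unlock again."""
        self.locked = False
```

So a cheek-sized contact flips `locked` on and every touch gets ignored until the release event, at which point a regular fingertip works again. That's the whole feature.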
I can't believe RIM didn't think about this.