It has been legal in Canada for women to appear in public with bare breasts for a few years now, but I haven't seen any women doing it. Apart from a couple of women at the beach changing without being overly discreet, it seems to be a non-starter with most women. Why? Is it the over-sensitive attitudes many women have about their bodies? The overly critical comments from men? Or are we just too prudish in our North American way?