I grow my own produce during the summer, so I know for sure that it is organic. The best part, though, is that it tastes so much better than the store-bought stuff!
Seems like I read somewhere that many organic fruits and veggies grown here in the US were tested, and some came back positive for pesticides. Unless I grow it myself, I don't trust it to be organic. That isn't to say there are no organic fruits and veggies in markets; the article was just pointing out that while some produce is labeled organic, it isn't always the case. I wish I could quote the source, but it was quite a while ago, and I don't remember where I read it. It could have been in the newspaper or in a magazine.