Using the sun to sell us lies

#1 by SuperMiguel
Here's something I just can't wrap my head around. Okay, so here's the deal: unless you've been living under a rock for the last decade, you've probably heard that the sun is bad for you. Exposure to the sun's rays causes cancer and other health problems. Okay, cool, let's assume that's true. But on the other hand, we're told we're not getting enough sun and need to buy and consume vitamins to make up for it. How does that make sense? First of all, how does "getting exposed" to the sun give you vitamins? That makes no sense. And which is it? Is the sun important, or is it bad? I think it's all BS to, yet again, sell us things we don't need.


#2 by LiamAtlanta
It seems you're taking a while to catch up. We've already written a post on this subject.

The sun doesn't exist. It's a fiction invented by the powerful to run their businesses and keep better tabs on us. The sun is like a giant multi-projector that our ancestors created to better control people and their food supply. If you notice, everything revolves around the sun... do you find that normal?


#3 by Aluminaughty
The sun was also used as an excuse to cover up the dinosaurs leaving. Everything about the sun is a lie.

Dinosaurs didn’t die. They left.

And they're coming back.


Powered by Kunena Forum