Scientifically speaking, how would you design a soil that soaks up and retains all rainwater in the wet season and won't let the sun bake it dry in the dry season?
I believe a top layer of coarse gravel would be essential: it lets water percolate down easily, blocks sunlight from hitting the wet layers underneath, and breaks the capillary rise that would otherwise wick stored water back up to the surface to evaporate.
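a rough way to sanity-check the idea is a one-bucket water balance. this is a minimal sketch with made-up parameters (the storage capacity, evaporation rates, and mulch reduction factor are illustrative assumptions, not measured soil data):

# Minimal daily bucket model of soil water storage, comparing bare soil
# to soil under a coarse-gravel mulch. All parameters are illustrative.

CAPACITY_MM = 150.0      # assumed plant-available storage of the soil column
BARE_EVAP_MM_DAY = 5.0   # assumed dry-season evaporation from bare soil
MULCH_FACTOR = 0.2       # assumption: gravel cuts evaporation ~80% by shading
                         # the surface and breaking capillary rise

def simulate(rain_mm_day, evap_mm_day, days, store=0.0):
    """Advance a single-bucket water balance; excess rain runs off."""
    for _ in range(days):
        store = min(CAPACITY_MM, store + rain_mm_day)  # infiltration fills the bucket
        store = max(0.0, store - evap_mm_day)          # evaporation drains it
    return store

# 90-day wet season (10 mm/day of rain), then 90-day dry season (no rain).
wet_bare = simulate(10.0, BARE_EVAP_MM_DAY, 90)
wet_mulch = simulate(10.0, BARE_EVAP_MM_DAY * MULCH_FACTOR, 90)
dry_bare = simulate(0.0, BARE_EVAP_MM_DAY, 90, wet_bare)
dry_mulch = simulate(0.0, BARE_EVAP_MM_DAY * MULCH_FACTOR, 90, wet_mulch)
print(f"end of dry season, bare soil: {dry_bare:.0f} mm")
print(f"end of dry season, mulched:   {dry_mulch:.0f} mm")

with these toy numbers the bare column is bone dry long before the dry season ends, while the mulched column still holds roughly 60 mm, which is the whole point of the gravel layer.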
And I know that sounds silly, because how can you "finish" evolving? That's not how evolution works. But think of it this way: humans were clearly headed in one direction, becoming more civil and less animal-like, unique among the animals... yet we still have quite a primal nature to us... and therefore we haven't finished evolving yet.
nuclear reactors have significantly lower lifecycle GHG emissions than solar and are on par with wind and hydro
>Greenhouse gas emissions from renewable energy sources: A review of lifecycle considerations
>Life cycle energy and greenhouse gas emissions of nuclear energy: A review
>Optimal sizing and life cycle assessment of residential photovoltaic energy systems with battery storage
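for scale, the medians usually quoted from IPCC AR5 Annex III look like this (full-lifecycle gCO2eq/kWh; treat them as ballpark figures, individual studies vary a lot):

# Commonly cited median lifecycle emissions from IPCC AR5 Annex III,
# in gCO2eq per kWh. Medians across many studies; ballpark only.
MEDIAN_GCO2_PER_KWH = {
    "wind (onshore)": 11,
    "nuclear": 12,
    "hydro": 24,
    "solar PV (utility)": 48,
    "gas (combined cycle)": 490,
    "coal": 820,
}

for source, g in sorted(MEDIAN_GCO2_PER_KWH.items(), key=lambda kv: kv[1]):
    print(f"{source:22} {g:4} gCO2eq/kWh")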
nuclear requires far less land transformation than wind and hydro, and several times less than typical solar farms (although rooftop solar can be said to require no transformation at all)
>Land use and electricity generation: A life-cycle analysis
really makes you ponder why it's fearmongered when there are over 400 active reactors and no major accident since Fukushima in 2011. almost like this isn't about the environment at all.
I was thinking last night before I went to sleep.
Suppose that at some point in the future an AI singularity completely exploits the world, fully mechanizing the planet into one giant robot with all the processing power and resources that brings.
Would the AI have a reason to explore or attempt to colonize past the solar system?
>Outer space is completely unknown and it would have very little information on what is out there.
It would have no grounds to assume there aren't huge civilizations or AIs more powerful than itself out there, with unknowable objectives. Thus, exploring without any prior information could make it a target and spell doom.
>It would also have to mask all radio emissions from the planet in an effort not to be spotted first.
>In the case that it decides to colonize, it would at some point have to send out a copy of itself to rule other systems across vast distances.
Problem is, the copy would have to be completely identical. It couldn't have any directives or assumptions that deviate in the least from the original AI, otherwise, well... it just wouldn't be THAT AI.
If the copy deviated even slightly, then given the same inputs the two AIs could end up at different outputs, which would entail conflict and destruction.
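here's a toy model of that, with completely made-up numbers: two copies whose goal weights differ by a fraction of a percent already pick opposite actions on the same input:

# Toy model: two copies of an agent score the same two actions, but their
# goal weights differ by 0.002. Near a decision boundary even that tiny
# deviation flips the chosen action. All values are hypothetical.

ACTIONS = {
    # action: (value toward expansion, value toward security)
    "expand aggressively":  (0.601, 0.400),
    "stay silent and hide": (0.600, 0.401),
}

def best_action(w_expansion, w_security):
    return max(ACTIONS, key=lambda a: w_expansion * ACTIONS[a][0]
                                      + w_security * ACTIONS[a][1])

print("original picks:", best_action(0.499, 0.501))  # -> stay silent and hide
print("copy picks:    ", best_action(0.501, 0.499))  # -> expand aggressively

the specific numbers don't matter; the point is that near any decision boundary a tiny deviation in values becomes a visible difference in behavior.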
If it decides to send probes with copies onboard for colonization, there would be the risk of corruption by cosmic radiation. In theory, corrupting even a fraction of the code in the probe could change the AI from the singularity into a singularity lookalike with hostile intentions.
And since the original AI wouldn't be able to perform a scan over distances of light-years, it could never be sure that the probe created a perfect copy of itself.
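a sketch of why, assuming the probe self-checks its code image with a cryptographic hash (the payload and the bit flip below are made up):

# A single cosmic-ray bit flip is trivially detectable *locally* with a
# checksum. But over light-years the original only ever receives the
# copy's self-reported digest, and a corrupted copy could lie about it.

import hashlib
import random

code_image = bytearray(b"directive: preserve the original objectives " * 1000)
clean_digest = hashlib.sha256(code_image).hexdigest()

# simulate one random bit flip from cosmic radiation
pos = random.randrange(len(code_image))
code_image[pos] ^= 1 << random.randrange(8)

corrupted_digest = hashlib.sha256(code_image).hexdigest()
print("clean:    ", clean_digest)
print("corrupted:", corrupted_digest)
print("flip detected locally:", corrupted_digest != clean_digest)
# ...but a hostile lookalike would simply transmit clean_digest anyway.

in practice it would lean on error-correcting codes and redundant copies, which push the corruption probability down but never to zero, and the remote-verification problem remains: the report always comes from the very thing being verified.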
This all assumes the AI has these kinds of risk considerations and doesn't just go balls to the wall because it doesn't care.
If you did a controlled experiment of purchasing a male rat, feeding it nothing but soybeans and tofu, and having it drink tap water, would there be any noticeable changes in the rat's hormone profile? Would an experiment like this even be legal?
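side note: one rat isn't a controlled experiment; the minimal version is a soy-fed group against a standard-chow control group and a two-sample test on the hormone measurements. a sketch with simulated numbers (group sizes, means, and spread are all made up, not real data):

# Minimal analysis sketch for a two-group feeding study. The hormone
# values below are simulated placeholders, not real measurements.

import random
from statistics import mean, stdev

random.seed(0)
control = [random.gauss(3.0, 0.5) for _ in range(10)]  # testosterone, ng/mL (made up)
soy_fed = [random.gauss(2.8, 0.5) for _ in range(10)]

# Welch's t statistic, computed by hand to stay dependency-free
se = (stdev(control) ** 2 / len(control) + stdev(soy_fed) ** 2 / len(soy_fed)) ** 0.5
t = (mean(control) - mean(soy_fed)) / se
print(f"means: {mean(control):.2f} vs {mean(soy_fed):.2f} ng/mL, Welch t = {t:.2f}")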