About the fake reviews, Amazon should immediately ban such sellers. I got one stupid product and I regret spending 2k INR on it. It doesn't work at all, and people out there were raving about it.
Same here. It has reached a point where we cannot trust anyone, be it YouTubers or reviews! :(
I add my agreement to Sharif's worries. Longtermism is a poorly conceived concept. One doesn't starve the present in order to feed a future that may never come to pass. We cannot predict future states of affairs in a complex phenomenon like human society even ten years out, much less one hundred years, and a thousand years out is beyond the predictive power of even concatenated exascale computers. So to take resources from the present's limited supply to address illnesses that may never arise, whether we tend to them or not, is sheer folly.
I agree with your point about allocating resources, and the crazy amount of funds flowing in is alarming. But isn't it wise to allocate some funds for the future population?
Yes, but it's a matter of degree, I think. Suppose at the moment there is a tornado on the ground, and we know it is heading in the general direction of a large city but will take some time to get there. So we take our resources and strengthen the large city's buildings and congratulate ourselves on our forward thinking.
But what if, as it eventuates, the tornado swerves slightly and hits another large city? Or perhaps it stays on course for the city we strengthened but lifts into the air or dissipates before it gets there?
Now wouldn't our resources have been better spent on measures to stop the tornado right now, where we detected it? At a minimum, by doing so, we would have protected all the other small villages, hamlets, farms, and towns it would strike before hitting that target city we worried about.
And suppose that, for whatever reason, our target city was abandoned or empty by the time the tornado arrived?
The point is that all future states of affairs share a certain degree of indeterminacy, and that condition only increases the further into the future we project.
Why, for example, use our present resources to protect the human species from our sun's eventual demise in the far future when by that distant time our species may have become extinct for any number of reasons? Or no longer live here? Or may have developed superior technology able to protect the planet?
Longtermism's faultiness is indisputable. Perhaps we should think more along the lines of "mediumtermism".
Great to have you back. Good luck with the new job! By the way, I hope you are aware of all the negative criticism of 'longtermism'. This is not the type of philosophy we need right now. You can see a short summary here (https://sadnewsletter.substack.com/p/long-term-vs-longtermism).
The amount of money flowing in is alarming, especially from Silicon Valley circles. But I personally feel some amount of research or resources should be allocated with current or near-term issues as the top priority. What are your thoughts?
Welcome back, Rishi! There are times when you need to step back from one thing to preserve energy for another.
Thanks a lot for the support Nick!
Good to see you back Rishi! I missed your work.
Thanks a lot for the support, Swarnali! :)
That color perception gedanken experiment is fascinating. A very interesting thought experiment on machine intelligence is Searle's Chinese Room.
Some reviews are paid: you simply write a positive review from your desk and get paid. You won't even try the product, and you still get paid for the review. Amazon won't do anything about it. The only option is to write a buyer review.