luzula: a Luzula pilosa, or hairy wood-rush (Default)
[personal profile] luzula
I have been to a conference on existential risk to humanity today and yesterday. Basically I went because someone at my department was the organizer, and I was curious about the subject--it was aimed at a fairly popular (or at least multi-disciplinary) audience.

It was interesting but also super frustrating. This is best illustrated by a conversation I had with a leading AI researcher at the conference dinner. He was doing research on the risk to humanity from intelligent AI. I said that I don't know much about that, but if he thinks there is a risk I approve of him doing research on that risk and trying to lessen it. I then talked about things that I (from my perspective as being engaged in the environmental movement for 15 years) think are risks to humanity, such as climate change, loss of biodiversity, degradation of soils, overpopulation and overconsumption, etc. He dismissed all these and said that technology would definitely solve these minor issues and I did not need to worry. He also said that since there existed a scientific solution to climate change (stop emitting CO2) the problem was now trivial and he did not care about political stuff.

This is like telling a starving person that you have theoretically solved their issue (they need to eat) and the fact that they do not actually have any food is a trivial problem!!!

AAAAAAARRRRRRGH. I showed humility by acknowledging his potential risk, since it is not my subject. Did he show any humility by saying "well, you probably know more about environmental issues than I do, maybe your risks might be worth taking seriously as well, just in case"? NO HE DID NOT. I found myself in the curious position of having much more in common with the economist from the business school who was on my other side--but then, he was actually working on environmental economics.

One of the speakers talked about how some risks were "sexy" (AI, aliens, etc.) and got a lot of attention from the existential risk people, while they ignored the mundane, "unsexy" risks such as the ones I listed above. Was it a coincidence that this speaker was a woman and almost all the other ones were male? I THINK NOT. I had quite a therapeutic conversation with her afterwards where we vented at each other. She said that she had found that these techno-optimists had an almost religious worldview where it is very hard to reach them with arguments.

(no subject)

Date: 2017-09-08 06:36 pm (UTC)
desireearmfeldt: (Default)
From: [personal profile] desireearmfeldt
Yerk.

(no subject)

Date: 2017-09-08 09:29 pm (UTC)
cahn: (Default)
From: [personal profile] cahn
He also said that since there existed a scientific solution to climate change (stop emitting CO2) the problem was now trivial

AHAHAHAHA that's very cute! I mean, it's his own decision whether he wants to care about political things, and fine if he doesn't want to (I am not super inclined that way myself), but to say that the problem is trivial because there exists a scientific solution is completely... wrong.

Although he sounds like a certain class of academically-oriented people I know, where it's all about who postures best, and quite possibly the way to get respect from someone like that is not actually to show humility but rather to assert loudly, using as many facts (even when irrelevant) and as much logic (even when illogical) as possible to buttress your claim, that his problem is trivial and yours is much more important. (Though that is frankly incredibly unpleasant, and also the VAST majority of actually smart people I know, many in academia (as opposed to academically-oriented people who think they're smart), are quite good at showing humility and interest in other points of view, which is a big plus when doing research.)

(I know I'm not saying anything you don't already know. I'm just SO ANNOYED on your behalf.)

Also, I suppose this is kind of undercutting your point, but I find it hard to take people seriously who really believe in the risk to humanity from intelligent AI. I don't work in the field exactly, but I'm kind of tangential to it at times (have worked on learning algorithms, etc. in the past) and... it's just... it doesn't really make any sense to me to worry about it. It's like saying you're worried that dolphins will evolve superior genetic capabilities and enslave the human race. It's... I guess you could say that there's a nonzero chance it COULD happen? And maybe it might be interesting to think about and to wonder about how it might happen, and maybe a couple of smart people might want to think about it? But... let's just say there are a lot of other things that will get us first. I will say that I went and looked up some people and there ARE well-regarded people in the field that disagree with me about the size of the relative risks (I would put it at practically zero, but others would put it higher), but I think most would agree that many other things would pose a much greater risk in the short term.

Also, this "leading AI researcher" -- I just find it hard to believe that someone actually high up in the field is saying this kind of nonsense. The most important current risks from AI as I understand them are actually not totally different from the reasons why climate change is hard -- if an AI is put in control of things, what kind of objective function is the AI using to control whatever it is it's controlling (humans, human society, whatever), who designed that function with what inputs, what unintended effects did it/does it have, what is it ignoring. We already see this with learning algorithms and algorithmic ways of interpreting data -- think about using testing for grading teachers, for example.
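To make that objective-function point concrete, here is a minimal toy sketch (purely my own illustration, with made-up coefficients; not anyone's real model) of what goes wrong when you optimize a proxy metric like test scores instead of the thing you actually care about:

# Toy illustration (assumed, made-up numbers): a teacher splits limited
# effort between test prep and deep instruction. The proxy objective
# (test score) rewards prep heavily; actual learning comes mostly from
# deep instruction.

def test_score(prep, deep):
    # Proxy objective: what the evaluation system measures.
    return 2.0 * prep + 0.5 * deep

def actual_learning(prep, deep):
    # What we actually care about, but never measure directly.
    return 0.2 * prep + 1.5 * deep

def best_split(objective, total_effort=1.0, steps=100):
    # Brute-force the effort split that maximizes the given objective.
    best = None
    for i in range(steps + 1):
        prep = total_effort * i / steps
        deep = total_effort - prep
        value = objective(prep, deep)
        if best is None or value > best[0]:
            best = (value, prep, deep)
    return best

_, prep, deep = best_split(test_score)
print(f"Optimizing the proxy:    prep={prep:.2f}, deep={deep:.2f}, "
      f"actual learning={actual_learning(prep, deep):.2f}")
_, prep, deep = best_split(actual_learning)
print(f"Optimizing what matters: prep={prep:.2f}, deep={deep:.2f}, "
      f"actual learning={actual_learning(prep, deep):.2f}")

Whoever picked the 2.0 and the 0.5 decided what the system optimizes, and the thing it quietly ignores (actual learning) is exactly where the damage lands.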

Ugh, sorry for the rant, this is just pushing all my buttons.

She said that she had found that these techno-optimists had an almost religious worldview where it is very hard to reach them with arguments.

Yeah. It's really a little odd.

(no subject)

Date: 2017-09-09 03:17 am (UTC)
cahn: (Default)
From: [personal profile] cahn
Okay, he might not actually be a leading AI researcher; maybe it's rather that the speakers at the conference are big names in existential risk from AI. Which is maybe not the same thing.

Now this I actually believe. I feel like there are a bunch of "big names" in existential risk from AI who are sort of these people who have taught themselves from what they think is reasonable... and, I mean, the problem with this is that for every Einstein who figures out relativity in a patent office there are about a thousand people who, because they haven't gone through what's already known and why it's known, come up with ways to exceed the speed of light and think they're all brilliant. But anyway.

And yes, the other thing I'd append to what you said (all of which is so true) is that people are MUCH harder than physics, in general. Like, how do you get a bunch of people to work against their self-interest? What does self-interest even mean in this context? (e.g., there are tribal-ish loyalties that start coming into play -- it's all very complicated!)

(no subject)

Date: 2017-09-09 08:15 pm (UTC)
cahn: (Default)
From: [personal profile] cahn
Possibly I am exaggerating slightly in the above examples. But only slightly.

I... don't think you're exaggerating a great deal. And, I mean, I am speaking from a perspective here where I actually kind of understand where they are coming from: if you are a sheltered technical person who likes everything to be consistent and completely rational, and you construct an objective function for humanity (by constructing a suitably smooth abstraction for "humanity") that has an extremely large value for "gets away from earth," then sure, it makes perfect sense! And I get this because I grew up as a sheltered technical borderline-Aspergers person who made those kinds of judgments. (And, hilariously to me now, I actually wrote a MacGyver future!fic as a teenager based on basically the premise that this was the future implemented by a shadowy conspiracy government of highly competent megalomaniacs. But anyway.) ...But then I grew up.

And probably a further difference is that I think any effort to colonize space will 1) most probably fail because we can't survive without the ecosystem we are part of, and 2) exhaust resources that we need here on Earth.

This is interesting to me! I am not sure about (2) (I mean, you're right, but does it depend on who's making the decision about "need"?), but (1) seems extremely reasonable and I feel like I don't hear people talking about it.

and also: there was this guy who said that in a seminar next week, he would present a proof that HUMANS HAVE NO VALUES.

...Heh. My response is, again, that's very cute. HUMANS ARE ACTUALLY KIND OF DIFFICULT TO MODEL, YO.

ARRRRGGH

Date: 2017-09-09 12:57 am (UTC)
jesse_the_k: From "Hamilton" the key phrase "Everything is Legal in New Jersey" (HAM NJLegal)
From: [personal profile] jesse_the_k
There is a scientific solution so it's trivial? Reeeeeeeally?

This reminds me of the joke about the physicist, the engineer and the mathematician who encounter a fire in the kitchen.

The physicist determines how long the fire's been burning, the volume of combustibles, calculates the expected duration of the fire, finds a fire extinguisher and puts it out.

The engineer grabs the fire extinguisher & puts it out.

The mathematician glances at the fire and at the extinguisher, says, "There is a solution!" and goes back to bed.

(no subject)

Date: 2017-09-10 08:13 pm (UTC)
blnchflr: Sue Storm Civil War (Sue Storm Civil War)
From: [personal profile] blnchflr
That does sound right annoying >:[

Relatedly, in regard to women/men: I was at a conference recently with only one female speaker, who commented on the fact that she was the only female speaker. Unfortunately she then went into how the other invited female speakers had cancelled, because the name tags had pins, which ruin silk shirts (which she was wearing). She may have been going for a metaphor about men arranging conferences, science, etc. from only their own perspective and with only their own needs in mind, but it mainly ended up sounding like she was whining about her shirt being ruined by the pin. And sadly I got the impression it made the men in the audience dismiss her actual point, which was that she was the only female speaker among 7 male speakers!

(no subject)

Date: 2017-09-11 03:01 pm (UTC)
lian: Klavier Gavin, golden boy (Default)
From: [personal profile] lian
Interesting conversation. You might enjoy following Pinboard guy, if you're on Twitter, or his rant on the subject on idlewords.com. He's acerbic, but shares your general take. (As do I, because I've sort of had it with the hubris and star-gazing short-sightedness of techno-utopians, especially of the Californian persuasion.)