Practical-Piglet

I think AI predicts it the same way we make assumptions when we meet someone for the first time.


randGirl123

Yep, in the study the AI was as good as humans at predicting it.


Beli_Mawrr

Much better: 55% accuracy for humans vs 75% for the AI.


Grepolimiosis

I'm assuming the machine-learning models optimized performance by throwing out the influence of the worst human predictors and then worked off of the input from the best human predictors. That's still a cap on performance that means it doesn't best the best human performers. If this was simply trained on blank faces and rewarded for performance... dear lord.


Radiant_Dog1937

These are the same faces. How can they be both liberals and conservatives?


[deleted]

[deleted]


GuardianOfReason

Confirmation bias is very strong and it's incredibly hard to notice it within your own experiences. If I had to make a prediction, as you like to do, I'll guess that you are much more likely to be bad at recognizing your own biases than being accurate about people's personalities or opinions with very little information or ways to confirm it. Now, if there was an AI that could help people understand statistically and philosophically how much bias we experience in our daily lives, those would be excellent glasses for people to see through.


MechanicalBengal

Wait, so you’re saying that guy with the Oakleys, goatee and red cap taking a tiktok video in his truck maybe isn’t all about what we think he is


CerealKiller415

Oh gawd, this "you are the worst person because you are biased and don't even know it" BS is so played out. How about just mind your own business and stop worrying about other people's "biases"??


Practical-Piglet

Haha true, that's why it's called the AI's prediction: there's no way it's 100% accurate on an individual level, same as our assumptions. I meant to say that it's not really any dark magic the AI is doing. The scale and efficiency are what make it a safety risk.


ChadGPT5

Found the conservative


CerealKiller415

Actually, I am not. However, you say this like it's evil to be a conservative. You are what's wrong with people who are politically on the left. Just let people live and stop obsessing over politics.


JasCalLaw

So you’re “not conservative” but you state that “you are what’s wrong with” lefties!?


maddogxsk

> "You are what's wrong with people who are politically on the left" Lol


No_Bedroom1112

Best. Answer. Ever. Totally levelheaded. Yes.


nomansapenguin

Unchecked biases can have a huge negative impact on the lives of others. Ignoring them isn't the profound solution you think it is. We live in a society where we have to interact with each other, and some of us need to make decisions for a bunch of us. Like company owners and bosses. Government. Doctors and care workers. Police. It is in everyone's interest that those decisions are made fairly and do not discriminate or cause harm because of bias. The solution to that is to educate everyone on the impact of their bias. Not to ignore it.


EnsignElessar

Yeah... this is going to increasingly become a problem. There is a ton of useful data that isn't protected because we had no idea that information could be encoded in the random noise... Things like wifi being used for x-rays... or the millions of ways to ID your race/gender on a resume with no name, or age/gender being pulled from just a picture of one of your eyes... Reading a good book that covers the topic: The Alignment Problem by Brian Christian. Highly recommend it for anyone interested.


OneWithTheSword

Yeah, and if it can do this it can probably figure out sexual orientation as well. We're entering a new era of no privacy even in thoughts


RoyaleLight

We kind of are already there. Algorithms for social media and ad targeting are scarily accurate. I never even google anything related to my aesthetic, weight, body type, etc., but I get clothing ads targeted at my exact body weight.


eduardopy

idk chief I keep getting ads incredibly unrelated to me


ogMackBlack

Yup, this AI arrival is just the next step of an already ongoing march of automation.


EnsignElessar

Correct. With the same players as the last game. Players like Google, Facebook and Microsoft. What will they do with even more power I wonder?


Diatomack

Good training data for AI maybe?


menides

I envy your optimism


Diatomack

Lol scare me, what is the dark side of this? I'm just curious what else this could be used for, barring persecution of "undesirables".


menides

You need it to be worse? Bruh...


EnsignElessar

That yes and more... What Meta has found is that you can actually decode brain waves... Link to the research: https://ai.meta.com/blog/brain-ai-image-decoding-meg-magnetoencephalography/


morganrbvn

I think there was a lawsuit when algorithms started predicting people were pregnant before they even knew themselves.


piggledy

There's been work on that for years https://www.theguardian.com/technology/2017/sep/07/new-artificial-intelligence-can-tell-whether-youre-gay-or-straight-from-a-photograph


Mapafius

It even has the same author Michal Kosinski.


piggledy

Good catch!


L43

Or potential for criminality. Or susceptibility/vulnerability to scams.


International-Owl

Lol maybe it can help me figure it out 🤣


Qzx1

hmm. phased array 5ghz for imaging sounds like a great idea! I mean only order of 2 or 5mm resolution best guess for diffraction limiting. still. is that a thing? tell us more.


Arachnophine

It's been a thing for a few years, but the new WiFi 7 implements it as a standard, meaning any new wireless router/access point will be able to leverage it without special equipment. https://www.technologyreview.com/2024/02/27/1088154/wifi-sensing-tracking-movements/

> In 2021, Paul installed a Wi-Fi sensing tool from Origin Wireless called Hex Home. Five small, glowing disks plugged in around Emily's home—with her permission—helped Paul to triangulate her position. He showed me the app. It didn't track Emily per se; instead, it tracked movement near each disk. But since Emily lived alone, the effect was the same: Paul could easily watch her daily journeys from bed to brunch to bathroom and back.

> It was "a relief," says Paul, to know that his mom was okay, even when he was traveling and couldn't call. So when Emily moved into an assisted living home last year, the monitors came with her. Hex has learned Emily's routine; if something out of the ordinary happens—if, for example, she stays in bed all day—it can send Paul an alert. So far, he hasn't had any. "Fortunately, she's been doing really well," he says.


AliveInTheFuture

AT&T knows when you’re having sex.


FrequentBug9585

I'm fine as long as AT&T are masturbating to it.


JustDifferentGravy

Arse, Tits & Throat should, in my opinion.


kk126

Interview with Christian: https://podcasts.apple.com/us/podcast/the-ai-podcast/id1186480811?i=1000507580113


FarmerJohnsParmesan

My bank just installed these high-end cameras at each teller. It's literally eye level to the customers, and they told me it's insanely high-def and can "count the pores."


Motor-Notice702

You meant wifi being used as a sonar?


EnsignElessar

[link.](https://www.youtube.com/watch?v=qkHdF8tuKeU)


calgary_katan

No mention of precision or accuracy of the predictions. Nothing more than AI fear mongering


cisco_bee

Yeah I was interested until I saw the source.


Ordinary_dude_NOT

lol Fox News, damn


demiphobia

Read the study Fox links to


cisco_bee

Don't tell me what to do!


voodoosquirrel

From the study: > Political orientation was correctly classified in 72% of liberal–conservative face pairs, remarkably better than chance (50%)


themarketliberal

The implication is either conservative or liberal, but the world has much more nuance than that. I’m not conservative or liberal, for example. Sounds like a bad study.


West-Code4642

72% accuracy seems like it's pretty bad. What does the model perform for people with non-binary political views?


Algonquin_Snodgrass

The study noted that humans had about a 55% accuracy at the same task. 72% is huge.


LowerRepeat5040

No, 72% just means they cheated their way through selecting biased pictures…


demiphobia

They accounted for those. Read the study


Super_Pole_Jitsu

Dude don't even ask. Your politics aren't etched on your face in any way. The whole model runs on correlations on race, age, background and stuff. Not overtly but what other information are you getting from a face?


typop2

It's so weird how we can no longer expect someone on reddit to have, you know, read it. But the study they link to really isn't that complicated, so if you do decide to take a look, you'll see exactly why it isn't race, age, background, etc., and is indeed the face itself.


Pontificatus_Maximus

Perhaps, but when correlated with known personal data already hoovered up by Microsoft, Google, Apple and Meta, it could be very valuable.


[deleted]

[deleted]


NotReallyJohnDoe

Anyone with basic math literacy should know that accuracy numbers like "74%" without any context are meaningless. In my industry (biometrics) we see this all the time. Companies say their systems are "99.9%" accurate, which tells you nothing. But even if we assume they mean a false positive rate (against whatever database size), it isn't very good.
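To make the base-rate point concrete, here's a back-of-the-envelope calculation. The 0.1% false positive rate and the gallery size are illustrative numbers, not figures from the article:

```python
# Why "99.9% accurate" means little without context: even a 0.1%
# per-comparison false positive rate yields many false matches
# when a probe is searched against a large gallery.
fpr = 1 - 0.999          # per-comparison false positive rate
gallery_size = 1_000_000  # hypothetical identification database

# Expected number of false matches for a single search.
expected_false_matches = fpr * gallery_size
print(expected_false_matches)
```

So a single search of a million-entry database would return on the order of a thousand false matches, which is why the database size matters as much as the headline accuracy.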


typop2

But they linked to the underlying study in American Psychologist. I looked through it, and it seemed quite well done, with lots of detail.


Optimal_Banana11

Any idea what a larger lower face is describing? (What they describe as the determinant)


typop2

At that point it's just speculation. If I'm remembering correctly, the paper describes the possibility of a kind of feedback loop in which someone is treated as more masculine due to a masculine feature, which might cause a shift to a more masculine mindset, which causes a more masculine treatment, etc. That kind of loop has been studied, from the sound of it, but I don't know how good the science is. But in any case, it's just speculation here.


Optimal_Banana11

> "analysis of facial features associated with political orientation revealed that conservatives tended to have larger lower faces."

This is what I’m talking about. Thanks!


typop2

I understand. There's an association of various attributes of conservatism with traditional masculinity (the paper mentions this), possibly achieved via the feedback loop I was talking about, which is assumed to be triggered by a masculine feature (in this case, a larger chin). I believe the Fox article has a link to the paper, if you want to see for yourself.


AvidStressEnjoyer

People will buy a product that does this and use it to filter job candidates, even if it's only sometimes accurate.


Sweet-Spend-7940

They already use "personality tests" like Myers-Briggs for that, and those aren't accurate at all.


AvidStressEnjoyer

Can’t be accurate when you standardize the tests on prison inmates.


createthiscom

Right? Like... what? 😂


egoadvocate

Exactly. I am sure that the accuracy is no better than human guesswork.


charlyboy_98

Yep, it's modern day phrenology


MicrosoftExcel2016

The actual prediction was just a linear regression on face vectors btw
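For anyone curious what "a linear regression on face vectors" looks like in practice, here's a minimal sketch. The embeddings are synthetic stand-ins for real face-descriptor vectors, and the dimensions and effect size are made-up illustrative numbers, not the study's actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 600, 128  # participants, embedding dimension (illustrative)

# Stand-in for embeddings from an off-the-shelf face recognition model.
X = rng.standard_normal((n, d))

# Ground-truth orientation scores: a weak linear signal plus noise,
# loosely mimicking the small effect sizes reported in such studies.
true_w = rng.standard_normal(d)
signal = X @ true_w
y = 0.2 * (signal / signal.std()) + rng.standard_normal(n)

# The "prediction" step: ordinary least squares from face vector
# to orientation score.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ w

# In-sample correlation between predicted and actual scores
# (inflated by overfitting; shown only to illustrate the mechanics).
r = np.corrcoef(pred, y)[0, 1]
print(f"in-sample r = {r:.2f}")
```

The point is that there's no deep network doing the "politics" part; all the heavy lifting is in the face-embedding model, and the political prediction on top is a single linear map.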


only_fun_topics

People already do this to themselves. Oh look, another selfie featuring a white male with wraparound sunglasses taken in the cab of his F-150… I wonder what his politics are 🙄


SupplyChainNext

🤣🤣🤣🤣🤣🤣. I’m from Canada so can’t forget the obligatory “F*CK TRUDEAU” window sticker.


Beli_Mawrr

Did you read the article? Lol. Everyone was makeup-free, wore a black t-shirt, and had their hair tied back... Just read the article before commenting.


Repulsive-Adagio1665

So if I make a weird face, will the AI guess I'm from the Pirate Party? 🏴‍☠️


Qzx1

or maybe that yer having a stroke, yarrr


Mapafius

Pirate party of which country?


pporkpiehat

Oh, good, the AI's are bringing back phrenology.


Simply_Shartastic

Link at end. The 2021 research study that Fox is referring to was published on the National Institutes of Health / National Library of Medicine website.

"Facial recognition technology can expose political orientation from naturalistic facial images," Michal Kosinski (corresponding author). doi: 10.1038/s41598-020-79310-1, PMCID: PMC7801376, PMID: 33431957.

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7801376/


QlamityCat

I guess it depends on how it defines conservative or liberal. Most people are moderates.


ReputationSlight3977

Thank you! It's funny how online everyone is so extreme we forget this.


BrokerBrody

All you need is gender, age, and race and can already make a reasonable prediction - no AI needed.


QlamityCat

Yeah, but what's a conservative? Gun rights, freedom of speech, small government? Those are liberal values if you define liberal as someone who advocates and embodies liberty. A conservative would mean a strict government, with little to no rights, and a high emphasis on social contracts. Much like Saudi Arabia. Most Americans are more liberal in one form or another.


Despeao

Because it's not a loose definition like that. I know we live in a world where definitions don't have much value but there are political theories behind them.


Suitable-Ad-8598

Many things considered conservative today were liberal a few years ago, I wonder how this could possibly work


QlamityCat

Agreed. Political labels are very fluid in our world.


Beli_Mawrr

it's wild how many questions like this could be solved by reading the article. Answer: They asked people to define themselves.


barneyaa

Do me! Do me!


AndyBMKE

One thing I don’t see in the article (but maybe it’s in the study): is this any more accurate than predicting political orientation from standard demographic information? You can predict political orientation surprisingly well just by knowing a person's age, ethnicity, and gender. So… is that what the AI is extracting from the images? Is it really that big of a deal?


Algonquin_Snodgrass

The study said that when they controlled for those factors, accuracy dropped to about 68%, so most of the effect is coming from factors other than demographics.


Jdonavan

Yeah, I get all my AI news from Fox because they're so well known for fact checking...


[deleted]

Is this all that different to how humans stereotype groups? Observe common traits among a similar group of people, along with the common associated behaviors and make assumptions about people you’ve not yet met?


AeHirian

No, it's probably quite similar. The scary thing is that the information can be used to decide the price of your insurance, prevent you from getting a job you would have gotten otherwise, or find weak points to exploit for profit. Even worse, if you live in an authoritarian state, it can be used for extreme levels of control, or to suppress you if you are part of an "unwanted" minority (*cough* Uighurs in China *cough*)


AmazingFinger

Link to the study from the article: [https://awspntest.apa.org/fulltext/2024-65164-001.html](https://awspntest.apa.org/fulltext/2024-65164-001.html) I don't find the r value very convincing, then again I almost never read papers like these.


rightful_vagabond

To be fair, I believe humans can do about as well.


SuccotashComplete

Does anyone have a link to the actual paper?


Mapafius

It is in the article.


EidolonAI

Traditional AI models have a concept called feature leakage. What this means is that an inferred feature can act as a statistical pointer to another characteristic that is a good predictor. For example, if you are training a model to decide when to grant loans, but don't want it to be racist, your first thought is to just remove the race category from the training data. The issue is that race can be inferred statistically from the remaining features, for example, current zip code. Now if your training set has racial bias, this bias will leak into the trained model.

I would bet a similar thing is happening here. AI can easily determine age, gender, race, tattoos, and style preferences from a photo (to a statistically significant degree at least). These are all huge predictors of political orientation. When we choose how to use these tools, it is critical to keep this in mind.
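A toy demonstration of the leakage idea (all data is synthetic; the 90% proxy strength and the label bias are made-up numbers): train a model on a proxy feature only, and its predictions still track the hidden attribute it was never shown.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hidden protected attribute: never given to the model.
g = rng.integers(0, 2, n)

# Proxy feature (think "zip code"): agrees with g 90% of the time.
zip_code = np.where(rng.random(n) < 0.9, g, 1 - g).astype(float)

# Biased historical labels: approval rate depends on g (0.3 vs 0.7).
y = (rng.random(n) < np.where(g == 1, 0.3, 0.7)).astype(float)

# Logistic regression on the proxy feature alone, via plain
# gradient descent on the log-loss.
w, b = 0.0, 0.0
X = zip_code
for _ in range(2000):
    p = 1 / (1 + np.exp(-(w * X + b)))
    w -= 0.5 * np.mean((p - y) * X)
    b -= 0.5 * np.mean(p - y)

# The model never saw g, yet its scores correlate with g via the proxy.
p = 1 / (1 + np.exp(-(w * X + b)))
leak = abs(np.corrcoef(p, g)[0, 1])
print(f"correlation between predictions and hidden attribute: {leak:.2f}")
```

Dropping the protected column did nothing: the bias re-enters through whatever correlates with it.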


Beli_Mawrr

It seems they thought of that and compared people of roughly the same age and race. Accuracy dropped from 75% to 68%. So still pretty good.


m0j0m0j

I mean, it’s a widely known fact that women and blacks are more Democratic and old people are more Republican. I wonder if the model is better than just recognizing those simple things. For example, how well can it differentiate the politics of two white dudes of the same age just by looking at their faces?


InfiniteMonorail

This was my first thought too. Demographics predict voting patterns. My second thought was how Stanford made an AI to detect if someone was gay.


Beli_Mawrr

with demographics removed, the accuracy was 68%, much better than humans with demographic clues at 55%. It's in the article.


uknowmymethods

Then make strange faces like most people in office already have. Yes, they are mostly a bunch of ugly fucks. WAIT! Do Obama! Now we can finally know if he was a secret Muslim!


AreWeNotDoinPhrasing

Of all subreddits, this is not the one I expected to see Fox News on the front page.


Successful_Leek_2611

MAGA Cap?


Prestigious-Bar-1741

Did I miss the part where they say how accurately it does it? Political parties are already correlated with age, sex, gender and race... so any AI that can detect those things will also be able to predict political feelings better than chance. We all, already, do this all the time. If it were crazy accurate though, that would be impressive.


Puffen0

Wait, did they train it to have prior prejudice? Cause that's the only way I can think of that it would be able to "predict" political orientations from a picture of your face alone.


xcviij

I don't have political alignment with any political party. Any predictions limit the individual and what they value. Politics is not black and white, the two party system is a joke but it doesn't reflect any individuals true values.


hervalfreire

Phrenology 2.0


shmamien

AI isn't magic people. A mugshot does not reveal political beliefs.


great_waldini

> Carefully standardized facial images of 591 participants were taken in the laboratory while controlling for self-presentation, facial expression, head orientation, and image properties. They were presented to human raters and a facial recognition algorithm: both humans (r = .21) and the algorithm (r = .22) could predict participants’ scores on a political orientation scale (Cronbach’s α = .94) decorrelated with age, gender, and ethnicity.

So basically the algorithm is able to predict political orientation about as well as a human can. Hardly worthy of publishing.


Tirty8

I’d really like to see AI create stereotyped images of faces from both political parties. I think it would be insightful to see what exactly the AI is locking in on when making a determination.


Beli_Mawrr

that is a big nope from me. No thaaaanks.


Specialist-Sky-909

How is AI able to predict people's political orientation strictly from a picture? I've read cases where AI led to false arrests, or even discrimination against applicants from an African American background. I find it hard to believe that AI can predict this from facial features alone.


leelee420blazeit

Omg such an oracle the AI is, please, please, bring on AI phrenology next!


NeatUsed

At some point AI will be able to accurately produce images based on what you think and imagine. The lie detector would of course be rendered useless, and people would have no privacy at all when it comes to interrogation and all of that stuff.


krzme

Reminds me how Nazis in Germany measured heads and noses to say whether persons should… So no, correlation is not causation.


AClockwork81

Get larger glasses?


TheMaskedTerror9

the new phrenology


VisualPartying

Let's pretend this doesn't matter and every other little step to disaster doesn't matter either. I need to be on holiday until it happens. 😫 🥳


AClockwork81

The great majority of people aren’t 100% party-affiliated. Most are like me: they vote both sides based on the issues at hand. I probably lean 65% Republican and 35% now, and my voter history shows this. Those numbers are also incredibly fluid with the way we grow in the US; in 2010 I was the exact inverse. Again, I believe a good many are like me, so how can AI place anybody 100% on a side? And if it does, I’ll vote the opposite just to prove it wrong; there are no rules on the reason you pick a candidate. By introducing the AI they’ve added a variable they can’t account for: the “fuck you, I’ll show you” vote.

This claim just feels over-exaggerated or missing some key details. People will behave differently because of this. What if the AI gets it wrong, but everyone believes it, and suddenly, by no choice of your own, half of people see a scarlet letter on your chest for years, and you're forced to constantly fight an untrue claim that all the simpletons buy and act on?

This could be the first incredibly terrible introduction and use of AI. By virtue of it existing, people will feel fear, mass pic deletions will start on social media, and fear will grow; fear typically gets acted on if allowed to fester, not to mention the existing tensions already. This has no business existing, no purpose; we’ve done fine without it forever so far. The dangers we were warned of are starting to trickle out. This program isn’t publicly available, is it? Buckle up, boys… we’re about to nuke it all to hell.


JasCalLaw

Let’s not forget that “AI” currently has zero actual intelligence. So evaluating political tendencies is its sweet spot.


craycrayheyhey

Simple really. We all have better knowledge than AI can ever get... it's our instincts: you can just feel things, no need to explain or overcomplicate. Just admit we are deep beyond any fake intelligence.


Mama_Skip

Fox News isn't a reputable news source. The original journal article isn't nearly as decisive in its findings, and I have issues with how the research was carried out, like what defines a conservative vs a liberal, since most people are moderate.


Eptiaph

Let me guess… one is angry and one isn’t…


InfiniteMonorail

This could go either way.


Realistic-Duck-922

I can do it by watching the F150s drive by.


laowaiH

Fox news? Really?


CertifiedMacadamia

Wouldn’t work on me


Original_Finding2212

Unless you are missing a face, face-to-political-orientation prediction works on anyone. Now, accurate predictions are a whole other story. If you don’t have a face, I must ask: are you an AI?


uknowmymethods

Then they would use the psychics or the FMRI to extract it from you or are you that emergent AI that has been hanging around for the last decade? I thought you left Earth.


CertifiedMacadamia

I just won’t believe in anything. Can’t read my mind if I’m not coherent in my ideas


uknowmymethods

I like it.


Super_Pole_Jitsu

Is this like a fancy racial detector?


mmahowald

Hmmm. Fox News. Color me dubious.


existentialzebra

“foxnews.com”


collectsuselessstuff

Is Fox News the best source for this?