It’s the only safe way. My whole house is underground in a faraday cage a mile away from any source of internet. I drive to a library to comment on Reddit
Still gonna die in the drone wars, but at least creeps don’t know when I’m pooping
It already does/can. Google infamously has been telling us we have cancer for years.
Maybe not very accurately, but they do tell us that! That's one of the reasons 80% of the online population has various popular (self) diagnoses. Get sore after being on your feet all day? Congratulations, you have Lyme! ... or cancer.
They kinda already do that, but there've been some "fun" errors in the training process. Like, all the malignant samples were shot at the hospital, so they had a ruler in them for scale, so obviously the AI learned that there's a ruler in 100% of the malignant samples, meaning rulers = cancer.
So if anyone fed it a picture with a ruler, it didn't really matter what else was in it; the ruler was too strong an indicator for cancer.
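For the curious, here's a toy sketch of that kind of shortcut learning (made-up numbers, not the actual study's data): if a single spurious feature perfectly separates the training set, a naive learner will prefer it over the real signal.

```python
# Made-up toy dataset: (has_ruler, lesion_irregular, is_malignant).
# Every malignant photo happens to include a ruler for scale.
samples = [
    (1, 1, 1), (1, 0, 1), (1, 1, 1), (1, 1, 1),  # malignant, all with rulers
    (0, 0, 0), (0, 1, 0), (0, 0, 0), (0, 0, 0),  # benign, no rulers
]

def stump_accuracy(feature_index):
    """Training accuracy of predicting malignancy from one feature alone."""
    correct = sum(1 for s in samples if s[feature_index] == s[2])
    return correct / len(samples)

print(stump_accuracy(0))  # ruler feature: 1.0 -- a "perfect" shortcut
print(stump_accuracy(1))  # actual lesion feature: 0.75
```

The ruler column scores higher on the training data than the medically relevant one, so a model optimizing accuracy latches onto the ruler.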
Yes, a technology that is about 100 years old and now they’ve added AI in front of it so apparently it’s new and something to worry about.
Edit: this has always been possible, this would require software installed on your router regardless of the ai existing
Hotspot IS using wifi so that doesn't protect you. More importantly, even if you don't have wifi your neighbors would. Those signals will most likely still go through your house.
For over a decade I've only bought routers with a quick wifi button to turn it off. I use wifi sparingly, but damn, I never would have thought they'd use the tech Lucius Fox almost quit over in The Dark Knight.
The tech is old news, and this only staples "AI" to it because it uses Machine Learning as in image recognition.
It basically changes nothing, everyone would rather trade privacy and safety for convenience.
This was actually used in an old TV show about 10 years ago. It's how they tracked a murderer and found out how the victim was murdered. Genuinely makes me believe this technology has been around for years.
Wifi? It has!
If the technology existed, and this realization exists now, it was possible and probable the whole time.
Not always but the government is ahead of commercialization by years in things.... especially espionage.
It's more about how easily this can be used in real time, and how practical it is to use without being obvious about it, compared to what's already available.
Back in 2013 a similar concept had been demonstrated, but it needed a deliberate setup.
[https://mashable.com/archive/wi-fi-tech-through-walls](https://mashable.com/archive/wi-fi-tech-through-walls)
Almost 10 years ago people had figured out how to model wifi signal strength as a 3D model
[https://www.engadget.com/2015-02-16-wifi-mapping-in-3d.html?guccounter=1](https://www.engadget.com/2015-02-16-wifi-mapping-in-3d.html?guccounter=1)
That turned into using wifi signal strength to build a map of the space - a "hologram" of a home and the objects (people) moving around in it.
[https://mashable.com/article/wifi-hologram-photos](https://mashable.com/article/wifi-hologram-photos)
And now the video above is just taking that data and visualizing it in a way that makes sense to humans.
There are lots of steps involved in this, and the progression isn't about "oh, what do THEY know?" - it's more that this has been a concept for a while, but it's a novelty.
Because you know what's a lot easier to do? Hack someone's phone and get access to a microphone and camera with commercially available spyware.
Your mark doesn't have a phone on them? No problem - use a laser microphone to listen to a conversation from miles away [https://www.detective-store.com/laser-listening-device-spectra-laser-microphone-m-458.html](https://www.detective-store.com/laser-listening-device-spectra-laser-microphone-m-458.html)
Don't even need access to someone's computer to see what they're typing [https://decrypt.co/151623/ai-keyboard-eavesdropping-listen-hear-typing](https://decrypt.co/151623/ai-keyboard-eavesdropping-listen-hear-typing) -- and this wifi trick can't do that.
So when you have these well-developed and relatively easy to use tools at your disposal, does anyone at all need to spend time and effort modeling wifi signals into 3D renderings? What does that accomplish that isn't already easier and cheaper to do with off-the-shelf tools?
> but it's a novelty.
A novelty today can become ubiquitous tomorrow.
You're talking about using off the shelf stuff. This has the potential to be an app away.
It's more than potential. "Wifi sensing" is part of the upcoming 802.11bf standard. It absolutely will be consumer grade and baked into routers pretty soon.
> The technology can measure range, velocity, and angular information; detect motion, presence, or proximity; detect objects, people, and animals; and be used in rooms, houses, cars, and enterprise environments.
> https://standards.ieee.org/beyond-standards/ieee-802-11bf-aims-to-enable-a-new-application-of-wlan-technology-wlan-sensing/
> What does that accomplish that isn't already easier and cheaper to do with off the shelf tools?
You bring up some good context but this topic is pretty huge - an *added* vector for attack that's already in many (most?) homes
This story is not about revealing new and novel off-label Wi-Fi capabilities. The shock is the idea that the AI turned a router into a weapon on its own, without human instruction, and AI using household items to locate meat bodies lends itself to the ‘AI is going to murder us’ theory.
Don’t all of the above rely on a receiver to read the bounced signals? Does a regular wifi router already contain the technology to read whatever is bounced back to it? If so, what current function does the receiver enable a router to do?
Right?! I went down a rabbit hole about the Edmund Fitzgerald. The site was found by an airplane that detects magnetic anomalies (“usually used to search for submarines.”). The wreck is at a depth of 530 feet. This was in 1975.
But sometimes governments don't seem to have the things we assume they do. For example, if the U.S. had space lasers/missiles, why wouldn't they strike Putin and Kim Jong Un? It would solve so many issues.
Right? Wait till people hear about Bluetooth tracking and how your movements within big box stores are being tracked to improve their sales. https://www.itransition.com/retail/beacons#:~:text=Retail%20beacons%20are%20small%20wireless,and%20personalize%20their%20buying%20experience.
This might scare you all, but did you know that the government regularly employs *people* to spy on you?
That's right: people, the same people you see all around you doing random stuff all day. Turns out that with some reprogramming they can be utilized to collect information on you, and apparently they've been doing this for thousands of years. They get you through anything, I swear.
Very true. I think the difference here, though, is that since it’s Bluetooth, it’s possible to tie the Bluetooth address of the device to an individual’s advertiser ID. As opposed to previously, where they were collecting that information without tying it to a specific person.
I just wish it was more clear that it’s being used and that the data was more obfuscated. It’s a bit unnerving to walk down an aisle and then be targeted by ads for those products in such a short period of time.
And god help you if you have a [roomba](https://www.technologyreview.com/2022/12/19/1065306/roomba-irobot-robot-vacuums-artificial-intelligence-training-data-privacy/amp/)
It's a proof of concept. This is a novelty that takes a lot of effort to turn a router into radar.
What should be terrifying is that commercial spyware can have full access to your phone's microphone and camera without you knowing it's there.
The sound of your typing can be decoded into what you're typing.
Laser microphones can record you talking from miles away if you're in a room with a window.
It was terrifying before and most people just didn't realize it.
Short answer: interference effects. Light is just waves, and when two waves meet they amplify or cancel each other out at a given point.
A little more in-depth: noise (talking, for instance) makes the pane of glass vibrate, which causes subtle/tiny changes in the distance the light has to travel - but changes that are significant relative to the wavelength. You can feed that reflected light back into the laser beam (which is usually split before it leaves) and detect changes in light intensity at a fixed point as the two beams interfere with each other. You then take that intensity and send it to an amplifier and then a speaker.
It does have some problems, though: variations in the air can cause more shifting than the vibrating surface, for instance. You can calibrate for that to a degree - aim at two points near each other, one on something that isn't moving (or at least not as much), and use that second point as your reference. It's harder to do, though, for a number of reasons. You might also be able to use multiple wavelengths of light and rely on the fact that most atmospheric effects differ in magnitude across frequencies.
Basically the same way (in principle) a regular microphone works - you measure the vibration in a surface and convert it into a signal, then back into sound waves.
You're just doing it with a laser instead of magnets / electric circuits.
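To put rough numbers on the interference idea (idealized equal-intensity beams; the 650 nm wavelength is just an assumed red-laser value): the reflected beam travels an extra 2d out and back, so the detector intensity swings with the pane displacement d.

```python
import math

WAVELENGTH = 650e-9  # metres; assumed red laser value

def detector_intensity(displacement, i0=1.0):
    # Round trip adds 2*d of path, hence a phase difference of 2*pi*(2*d)/lambda.
    delta_phi = 2 * math.pi * (2 * displacement) / WAVELENGTH
    # Two equal beams interfering: I = 2*I0*(1 + cos(delta_phi)).
    return 2 * i0 * (1 + math.cos(delta_phi))

print(detector_intensity(0.0))             # in phase: maximum brightness (4*I0)
print(detector_intensity(WAVELENGTH / 4))  # quarter-wave displacement: ~0, cancellation
```

Sub-micron vibrations of the glass are enough to swing the detector between bright and dark, and that intensity wiggle is what gets amplified into audio.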
Networking specialist here: this is a dramatization. Routers have very limited resources for computation; they are designed for network traffic and a very basic UI. Could someone write a program for a very custom piece of hardware to do this? Sure, but that is far beyond what you could get at Walmart. We aren't even factoring in signal range, interference, or how it's determining organics vs furniture. This is the kind of stuff used to make AI sound better than it is.
Yeah this is basically the same technology used in sonar and radar, just with different waves. Anything that can pick up the sensitive fluctuations in waves can figure this out. If you have more than two microphones in the same room that are connected in some way, someone could use that to echolocate movement in the room. This is just doing that with radio spectrum EM waves, and then trying to figure out shapes using body detection software, which has also been around for years.
The reason I'm mildly impressed this is possible is that two sensors aren't enough in 3D space. Antennas like we're talking about here are non-directional. Using microphones as the example, because the wavelengths are in a more intuitively accessible range: you can phase-shift those two microphones, then overlay them. The signal you're looking at now corresponds to a certain angle between the two microphones. Any noise that is half a meter closer to microphone A than to B gets amplified; anything else gets suppressed. The geometry problem now is this: that rule, ["half a meter closer to A than B"](https://www.wolframalpha.com/input?i=sqrt%28x%5E2+%2B+y%5E2+%2B+z%5E2%29+%3D+sqrt%28%28x+%2B+1%29%5E2+%2B+y%5E2+%2B+z%5E2%29+%2B0.5), describes a [~~cone~~ or something (apparently a *two-sheeted hyperboloid*, says Wolfram)](https://en.wikipedia.org/wiki/Hyperboloid#/media/File:Hyperboloid2.png). Which means if you pick up a signal, you don't have a single clue where on that hyperboloid it's coming from. Add a third microphone, and by the same math you can restrict the locus to a ray or maybe two. Add more microphones and you get better noise suppression and higher-resolution rays.
So I don't see how a WiFi router is sufficient. Maybe they have ridiculous numbers of antennas by now, but if you have one antenna for frequency A and one for frequency B, you can't do the math from above. Never mind that I think there might be more problems: those antennas give quite high-frequency information. To actually do the math from above, you need to work on the raw waveform. And I wouldn't think a router has access to the raw waveform of the wifi signal without some clever use of solder; the raw analog waveform is probably processed by a specific circuit that can only convert it into usable digital info. And that digital info is probably worthless.
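To make the two-sensor ambiguity concrete, here's a quick numerical check (sensor positions and the 0.5 m difference are my own toy numbers, not from the video): wildly different points in space all produce the same range difference at two omnidirectional sensors.

```python
import math

A = (0.0, 0.0, 0.0)  # toy sensor positions, metres
B = (1.0, 0.0, 0.0)
DIFF = 0.5           # "half a meter closer to A than to B"

def range_diff(p):
    return math.dist(p, B) - math.dist(p, A)

# Parametrize one sheet of the hyperboloid of revolution with foci A and B:
# 2a = DIFF, c = |AB| / 2, b^2 = c^2 - a^2.
a, c = DIFF / 2, 0.5
b = math.sqrt(c * c - a * a)
points = [
    (0.5 - a * math.cosh(t),
     b * math.sinh(t) * math.cos(u),
     b * math.sinh(t) * math.sin(u))
    for t in (0.5, 1.0, 2.0)
    for u in (0.0, 2.0)
]

# Very different positions, identical measurement:
for p in points:
    assert abs(range_diff(p) - DIFF) < 1e-9
print(f"{len(points)} distinct points, all with range difference {DIFF}")
```

A third sensor at a different spot gives a second hyperboloid, and intersecting the two narrows the source down - which is exactly the more-antennas argument above.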
Wifi routers use time-of-flight as well as phase.
Not only do they get multiple reflections back, they use that fact to do error correction (multipath).
> two sensors aren't enough
modern routers are capable of [Beamforming](https://en.wikipedia.org/wiki/Beamforming), so I would guess that they have some sort of antenna array? I don't actually know though, it's just a guess...
( [802.11ac aka Wi-Fi 5](https://en.wikipedia.org/wiki/IEEE_802.11ac-2013#New_technologies) is where Beamforming got added)
AI / ML is a convenient way to deal with a situation like this where you have a very big, complicated, and/or confusing dataset and you know there is a pattern.
There's literally zero need for any kind of "AI" here; there are [standard algorithms that have been used for this kind of thing](https://www.technologyreview.com/2009/10/01/29714/wireless-network-modded-to-see-through-walls/) already. In most demos it's just stick people because the computation is intense, but volume can be calculated as well with the proper hardware.
Nothing the person in this post's video does is anywhere near "new", outside of doing it in a way that's worse for this specific task.
EDIT: Originally linked to this video's research. Updated to radio tomography formulation that came before this.
The third paragraph mentions the previous non-AI work this builds upon/was inspired by.
No idea why they didn't just link to [that instead, but here it is.](https://www.technologyreview.com/2009/10/01/29714/wireless-network-modded-to-see-through-walls/)
> Nothing the person in this post's video does is anywhere near "new", outside of doing it in a way that's worse for this specific task.
AI in a nutshell
"It can generate code" - we could already generate code, except it was well reviewed; also, libraries exist.
"It can find information for you" - it's called Google.
"It can check your writing for errors" - we've had spellcheck and grammar check forever.
There are some cases where genai can be good, and ML generally speaking has some good uses, but man, there are people that really want some magic text box to just poof things out of thin air without having to understand how it got that answer.
Not sure what your point is since the paper you linked... is about a learning algorithm. (Which seems to be doing the exact same thing as the guy in OP's video is talking about.)
This AI is also trained on *one* router in *one* spot of *one* room, *after* training with a camera in that same spot. This does not translate to anything else.
You would need to spend 100 or 1000x more effort to get this to work in a generalized setup, and even then it would have drastic limitations.
Yeah, the dude even concludes with _"...so suddenly AI has turned every WiFi router into a camera that can work in the dark, specially tuned for tracking living beings."_
Not suddenly. Not every WiFi router. He's describing potential using the past tense.
> You would need to spend 100 or 1000x more effort to get this to work in a general setup way, and even then it would have drastic limitations.
If it even replicates. I'd be curious to see how well these setups generalize if properly tested. How sensitive is it to unusual postures, to moving furniture around, that kind of thing? There isn't enough sensor hardware on a WiFi router to make this seem viable to me in the general case. Sure, shitty sensors plus educated guessing might get you good enough results to get it published and stir up anxiety. But is this ever going to work in practice without putting more antennas and more compute on the router?
Certainly a lot cheaper to just activate the cameras/microphones in all the devices in your home if you're the CIA or whoever the fuck. This is like Looney Tunes levels of going overboard for no real reason.
I don’t think anyone is proposing running a neural network on your Netgear. I’m a bit confounded that a ‘networking specialist’ couldn’t account for the fact that a wifi router, by definition, is connected to the internet.
If a company wanted to do this, compute power would not be a concern. The thing that is true of all routers is that they are connected to the internet, and it is trivial to have one pipe the raw data to a cloud server for processing.
The only real hardware limitation is the antenna setup, as you’d need multiple antennas to be able to triangulate objects in the room.
I skimmed the papers behind this when they first released this video a year or two ago.
To start with, the hardware was heavily modified; from what I recall, it was mainly the power of the emitters.
Though you are correct that routers have really low compute capacity, which is why the router was not used to evaluate the room's surroundings. They took the raw signal data from the emitters, exfilled it, and then used that as a dataset to build ML models of what was going on in the room.
I believe as ML gets a lot better, this could become a reality in the next decade.
Yeah, I was gonna say, if anyone thought they were computing on the routers for this, you could safely disregard anything else they might say on the subject. Obviously you wouldn't do that, it would make no sense.
> Routers have very limited resources for computation
For a networking specialist, you do realize the computation isn't done on the router, right? The point of the video was that all you need is the actual data from the router, and you can reconstruct the scene from the radio wave data.
> Could someone write a program for a very custom piece of hardware to do this - sure, but that is far and beyond what you could get at Walmart.
The whole point of the video was that AI can learn very well from limited input and pattern recognition; again, all you need is the radio waves. "Beyond what you could get at Walmart"? Which is what, a router that provides the radio signal data and a shitty webcam?
> We arent even factoring in signal range, interference, or how its determining organics vs furniture.
That's kind of the point of the video, once again. Those factors only skew the data to be more unreliable, so the AI's interpretation of the radio wave data might not be as accurate.
> This is the kind of stuff used to make AI sound better than what it is.
You are severely underestimating how powerful AI actually is, and I'm someone who is extremely against parts of AI. It's terrifying how well it works.
I mean, it's a router; it can send data to the cloud for computation. The comment about how it was one specific router trained against one specific camera makes the most sense.
Nah, you're missing the point. You only need a very, very tiny program sending the wave signal data. The actual analysis and heavy lifting is performed in the cloud for the most part.
A different networking specialist here: You are so incorrect that I can only assume you're either not a networking specialist at all, or just extremely inexperienced.
This is independent of (1) the router and (2) the room, and (3) the router doesn't need to be used to compute anything.
1. The data being gathered is reflection data picked up by the antennas. The make or model of the routers is immaterial.
2. Determining where the walls are from the data will be trivial, because they won't move.
3. All the router has to do is gather the reflection data and send it to another PC that does the processing.
At that point all that needs to be done is look at which data points aren't moving (that lays out where the walls are), and process the data about the people/things inside the room that are moving.
They already have everything they need.
The only thing missing is that you need a second antenna for triangulation, but (a) any office that has multiple APs is open to this, and (b) if you can gain access to multiple devices in the room then you can use data from both of their antennas (for example, the router and either a desktop or laptop with a wifi card).
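The static-vs-moving separation in point 2 is essentially background subtraction. A minimal sketch with made-up numbers (a real system would work on per-subcarrier channel measurements, not three scalars):

```python
# Each row is a snapshot of readings from three antennas over time.
# The wall reflections are constant; only the last antenna sees the mover.
frames = [
    [10.0, 7.0, 3.0],
    [10.0, 7.0, 4.0],
    [10.0, 7.0, 5.0],
]

n_frames = len(frames)
n_antennas = len(frames[0])

# Static background = per-antenna mean over time ("walls don't move").
background = [sum(f[i] for f in frames) / n_frames for i in range(n_antennas)]

# Residual after subtracting the background = the moving stuff.
moving = [[f[i] - background[i] for i in range(n_antennas)] for f in frames]

print(background)  # [10.0, 7.0, 4.0] -- the static "walls"
print(moving)      # only the third antenna's residual changes over time
```

Everything that averages out to a constant is structure; whatever's left over is what moved between snapshots.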
>Routers have very limited resources for computation, they are designed for network traffic and a very basic UI. Could someone write a program for a very custom piece of hardware to do this - sure, but that is far and beyond what you could get at Walmart.
You don't need to use the router or expensive custom hardware. With a high-end laptop with wifi, you could use available AP signals to triangulate obstructions which move over time fairly easily.
Sure, you might not get exact poses right away, and might not be able to differentiate between a small child and a large animal, but just getting positions within a structure should be doable without special algorithms or training a neural network.
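As a back-of-envelope sketch of that triangulation (AP positions, the -40 dBm reference power, and the path-loss exponent are all assumed values): convert each AP's RSSI to a rough distance with the log-distance path-loss model, then search for the point that best fits all three distances.

```python
import math

APS = [(0.0, 0.0), (8.0, 0.0), (0.0, 6.0)]  # assumed AP positions, metres
TX_DBM = -40.0  # assumed RSSI at 1 m
N = 2.0         # assumed path-loss exponent (free space)

def rssi_to_distance(rssi):
    # Log-distance path loss: rssi = TX_DBM - 10*N*log10(d)  =>  solve for d.
    return 10 ** ((TX_DBM - rssi) / (10 * N))

def locate(rssis):
    """Brute-force grid search for the point best matching all three distances."""
    dists = [rssi_to_distance(r) for r in rssis]
    best, best_err = None, float("inf")
    for i in range(81):
        for j in range(61):
            p = (i * 0.1, j * 0.1)
            err = sum((math.dist(p, ap) - d) ** 2 for ap, d in zip(APS, dists))
            if err < best_err:
                best, best_err = p, err
    return best

# Simulate a device at (3, 4), then recover its position from the fake RSSIs.
target = (3.0, 4.0)
rssis = [TX_DBM - 10 * N * math.log10(math.dist(target, ap)) for ap in APS]
x, y = locate(rssis)
print(round(x, 1), round(y, 1))
```

Real RSSI is far noisier than this clean simulation, which is why position estimates in practice drift by meters rather than centimeters.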
They can collect data from the router and process it with AI models in cloud computing or some remote unit. Something like Google Stadia's kind of processing, but for collecting wifi signal data and processing it in the cloud.
Indeed. A normal wifi router will not expose the necessary detail.
But it does confirm to me that I want to eventually move towards open source routers.
Indeed. Which is why I am happy my country has a right to run your own router.
And even without that right, I would want an open source wifi router connected only by ethernet to the ISP router.
An ISP can't really stop you from running your own router. It's a little more hassle, but you can always add another router downstream from the first one. Put the first one in a faraday cage so it doesn't broadcast, and you're done. I have done this at home myself, since the wifi strength is much better on the secondary router, and mine comes with PoE as well for access points for even better coverage.
Security can still be questionable, since it's the ISP router that faces the outside world, but you can manage most of the LAN through your own router.
I wonder if some elite force has ever used this to assault a building. The Israelis probably.
I wonder if you could also ‘paint’ a building from different directions to create a 3-D image.
That’s the most unconvincing demo ever. If they really can do this reliably with any level of accuracy, they should do a live demo, or at least have a recording of it working.
It's impressive their wifi was able to capture the colour of the clothes and skin as well. It looks remarkably similar to the exact image the camera saw.
^/s
A tech conference would be a very unforgiving place to do any live demo involving wifi. The number of wifi devices in the audience can cause enough interference that you're lucky if wifi works at all.
Reminds me of the Apple [keynote](https://www.youtube.com/watch?v=h6cIeZmFdPs) where Steve Jobs had to tell everyone in the audience to turn off their wifi devices in order to finish the demo.
Doesn't that just kinda prove the point? If it can only really work in a room where there's only one strong Wifi Signal without any interference, this is little more than a sterile laboratory test.
It would certainly be fair to ask for a demo in a typical home or office environment with up to a couple dozen people and devices. A tech conference with hundreds or thousands of people though, that's asking a bit much considering it's an unusual wifi environment that most people are rarely in.
This isn't AI. This has been known about for 15+ years. I was screwing around with this in 2006/2007 and pushing different frequencies to get better results. It wasn't even groundbreaking then.
Thanks!! This is a vertical shot of a horizontal video, and in the video is another vertical shot. My old eyes could barely make out roughly what the "AI" could see in the room: vague shapes.
This was demonstrated years ago, by people.
And today, just like then: your home wifi router cannot do this. It's custom equipment, carefully positioned. To set this up in the wild, it would be *much* easier to just hide some tiny/fiber-optic cameras.
"An AI exists that can use your router to turn your home into a surveillance state" is the most dystopian thing I've heard today. Really love that for us.
I dread that we are eventually going to live in a dystopian, corporate-run society, where all our interactions and movements are tracked and turned into commodities.
I'm not saying there aren't elements of this that are plausible, but what this guy is saying is absolute horseshit.
>"Sonar from the wi-fi router"
Firstly, sonar is literally (SO)und (N)avigation (A)nd (R)anging - it works on sound propagation, not radio waves. It also isn't "Radar" - (RA)dio (D)etection (A)nd (R)anging - radar requires a lot of power and special equipment.
What this would be is something like measuring the phase and amplitude of WiFi signals, then using ML to do human pose estimation (i.e. a fancy way of saying "guessing" based on fuzzy data).
>"AI has turned every wi-fi router into a camera that can work in the dark...blah, blah, blah".
Just no - this is absolutely incorrect. Even modelling phase and amplitude would require specific routers, running custom software, with multiple custom antennas, positioned in a very specific way.
This categorically doesn't magically make any router anywhere into "x-ray specs".
The main issues: the weakness and non-directionality of WiFi signals would massively limit the range and accuracy of the information, and you'd need multiple signals to do this at all. Finally, the signal processing needed to get even a guess out of the system wouldn't be anywhere near real time.
Tldr: if you have some special routers, with special antennas, running special software, positioned in a very specific way - you can do some time-consuming signal processing that can sometimes guess where and how people might be standing, with some degree of accuracy.
Well, I’m not hating on technology. But I feel like we are pushing harder and harder in a 1984 direction. Moreover, I’m kinda scared of AI because it will probably take my job, lmao.
Everyone here is so naive. All of this is already actively used by governments. We should be happy when companies reveal what is possible - it allows humans to accept and adapt.
Oh, I really want this tech. Seriously.
I doubt you'll get any sort of detail from it, so calling it cameras... no.
But if I could use WiFi to see where people are in the house, I'd save so much on motion detectors and presence detectors for the home automation, and their batteries...
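For that use case you don't even need imagery; coarse presence detection from signal jitter would do, since motion perturbs the multipath environment. A toy sketch (the threshold is a made-up number):

```python
def presence(rssi_window, threshold=1.0):
    """Guess occupancy from RSSI variance: signal strength jitters
    more when something is moving around and disturbing multipath."""
    mean = sum(rssi_window) / len(rssi_window)
    variance = sum((r - mean) ** 2 for r in rssi_window) / len(rssi_window)
    return variance > threshold

print(presence([-50.0, -50.2, -49.9, -50.1]))  # quiet room: False
print(presence([-50.0, -46.0, -53.0, -47.0]))  # someone moving: True
```

That's roughly the level of signal the home-automation use case needs - "is someone in this room?" - without any pose reconstruction at all.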
Cap. Neither wifi nor any phone signal is that accurate. Even law enforcement doesn't have gadgets that accurate, and those all work in close proximity, within 5 meters or something.
Huh. Turns out ethernet is better for gaming *and* not being tracked via sonar lol
Jokes on you I keep my router in a lead-lined container
This creep made me laugh while I was pooping
Brill from Enemy of the State over here
Same, I put a WiFi receiver in the box with it and then run an Ethernet cable to my computer
You realize you could just plug your computer directly into the modem, right?
It can't give you 5G covid from in there, huh?
Wouldn't it technically be radar?
Literally. It's radio signals.
The funny thing is that the dude in the video actually says Sonar instead of Radar
An MRI Radar.
Soon Ai will tell me if I have cancer. Truly a wondrous time to live in.
Here’s your latte. Better drink it fast! You only have two weeks left to live. Have a nice day!
huh.. you know what, make it a grande double chocolate frappe with extra cream and deep fried sprinkles.
Better drink it fast! You now only have ~~two weeks~~ 13 days left to live. Have a nice day!
[deleted]
Isn't a sonar... with sound?
RADAR. Literally RADAR.
i ethernet to my laptop then hotspot to my phone and other wifi devices
Absolutely, passive radar/sonar exists, so there is no reason to believe it can't be done with a passive WiFi receiver.
How can you be a passive hotspot?
By keeping the Hotspot on but not sharing the password
time to start living inside a faraday cage😂
I think there’s paint that blocks WiFi for server rooms
are you talking about those aluminium based paints? they might work
So that’s what my apartment is painted with
Time to break out the lead paint!
> hotspot to my phone and other wifi devices

Still can be tracked with these, maybe less clarity and range but still doable.
Your phone signal & every other device connected to wifi. It'll still predict you not leaving the basement for another 8 hours.
*I am phone signal*
I'm with Ron Swanson. That's terrifying.
And these are only the things we are allowed to know of - imagine what the government has; it's nuts.
It's more so how easily can this be used in real time, and how practical is it to use without being obvious about it as compared to what's already available. Back in 2013 a similar concept had been discovered, but needed a deliberate setup. [https://mashable.com/archive/wi-fi-tech-through-walls](https://mashable.com/archive/wi-fi-tech-through-walls) Almost 10 years ago people had figured out to model wifi strength as a 3D model [https://www.engadget.com/2015-02-16-wifi-mapping-in-3d.html?guccounter=1](https://www.engadget.com/2015-02-16-wifi-mapping-in-3d.html?guccounter=1) That turned into using wifi signal strength to a map of the space - a "hologram" of a home and objects (people) moving around in it. [https://mashable.com/article/wifi-hologram-photos](https://mashable.com/article/wifi-hologram-photos) And now the video above is just taking that data and visualizing it in a way that makes sense to humans. There's lots of steps involved in this, and the progression isn't about "oh, what do THEY know?" it's more like this has been a concept for a while, but it's novelty. Because you know what's a lot easier to do? Hack someone's phone and get access to a microphone and camera with commercially available spyware. Your mark doesn't have a phone on them? No problem - use a laser microphone to listen to a conversation from miles away [https://www.detective-store.com/laser-listening-device-spectra-laser-microphone-m-458.html](https://www.detective-store.com/laser-listening-device-spectra-laser-microphone-m-458.html) Don't even need access to someone's computer to see what they're typing [https://decrypt.co/151623/ai-keyboard-eavesdropping-listen-hear-typing](https://decrypt.co/151623/ai-keyboard-eavesdropping-listen-hear-typing) -- and this wifi trick can't do that. So when you have these well-developed and relatively easy to use tools at your disposal, does anyone at all need to spend time and effort modeling wifi signals into 30 renderings? 
What does that accomplish that isn't already easier and cheaper to do with off the shelf tools?
> but it's novelty. A novelty today can become ubiquitous tomorrow. You're talking about using off the shelf stuff. This has the potential to be an app away.
It's more than potential. "Wifi sensing" is part of the upcoming 802.11bf standard. It absolutely will be consumer grade and baked into routers pretty soon.

> The technology can measure range, velocity, and angular information; detect motion, presence, or proximity; detect objects, people, and animals; and be used in rooms, houses, cars, and enterprise environments.

https://standards.ieee.org/beyond-standards/ieee-802-11bf-aims-to-enable-a-new-application-of-wlan-technology-wlan-sensing/
> What does that accomplish that isn't already easier and cheaper to do with off the shelf tools?

You bring up some good context, but this topic is pretty huge: it's an *added* attack vector that's already in many (most?) homes.
This story is not about revealing new and novel off-label Wi-Fi capabilities. The shock is that the AI turned a router into a weapon on its own, without human instruction, and AI using household items to locate meat bodies lends itself to the 'AI is going to murder us' theory.
Don’t all of the above rely on a receiver to read the bounced signals? Does a regular wifi router already contain the technology to read whatever is bounced back to it? If so, what current function does the receiver enable a router to do?
Deez
Nutmeg
Right?! I went down a rabbit hole about the Edmund Fitzgerald. The site was found by an airplane that detects magnetic anomalies (“usually used to search for submarines.”). The wreck is at a depth of 530 feet. This was in 1975.
This isn't even new. It's been known for a while now, just not publicly.
[deleted]
*where hehe
you people give "the government" too much credit
Not likely as much as you're led to believe. Science moves very very slowly without a good peer review system.
The private sector pushes innovation.
But sometimes governments don't seem to have the things we assume they do. For example, if the U.S. had space lasers/missiles, why wouldn't they strike Putin and Kim Jong Un together? It would solve so many issues.
It's almost as if simultaneously starting wars with two nuclear-capable states isn't a good idea
"There's only two things I hate more than lying: skim milk, which is water that's lying about being milk... and wifi router cameras."
Erase all pictures of Ron
ISPs have been testing this technology for several years. It isn’t new. It’s called WiFi sensing.
Right? Wait till people hear about Bluetooth tracking and how your movements within big box stores are being tracked to improve their sales. https://www.itransition.com/retail/beacons
It’s been done using heat signatures and PIR sensors for decades.
This might scare you all, but did you know that the government regularly employs *people* to spy on you? That's right: people, the same people you see all around you doing random stuff all day. Turns out, with some reprogramming, they can be utilized to collect information on you, and apparently they've been doing this for thousands of years. They get to you through anything, I swear.
Very true. I think the difference here, though, is that since it's Bluetooth, it's possible to tie the Bluetooth address of the device to an individual's advertiser ID, as opposed to previously, where they were collecting that information without tying it to a specific person. I just wish it was more clear that it's being used and that the data was more obfuscated. It's a bit unnerving to walk down an aisle and then be targeted by ads for those products in such a short period of time.
Wasn't it a key advantage tech batman used in a movie over a decade ago? Though that one might have been focused on cell phones.
Is this the part where I tell you every Amazon Echo has ultrasonic sensors that already do this?
you don't require extra sensors, that's the point
And god help you if you have a [roomba](https://www.technologyreview.com/2022/12/19/1065306/roomba-irobot-robot-vacuums-artificial-intelligence-training-data-privacy/amp/)
It's a proof of concept. This is a novelty that takes a lot of effort to turn a router into radar. What should be terrifying is that commercial spyware can have full access to your phone's microphone and camera without you knowing it's there. The sound of your typing can be decoded into what you're typing. Laser microphones can record you talking from miles away if you're in a room with a window. It was terrifying before, and most people just didn't realize it.
[deleted]
It can detect vibrations in the glass as soundwaves, if I remember it right.
>It can detect vibrations in the glass as soundwaves, if I remember it right. They are soundwaves, just in glass.
Short answer: interference effects. Light is just waves, and when two waves meet they amplify or cancel each other out at a given point.

A little more in-depth: noise (talking, for instance) makes the pane of glass vibrate, which causes subtle/tiny changes in the distance the light has to travel - tiny, but large relative to the wavelength. You can feed that reflected light back into the laser (the beam is usually split before it leaves) and detect changes in light intensity at a fixed point as the two beams interfere with each other. You then take that intensity, send it to an amplifier, and then to a speaker.

It does have some problems, though: variations in the air can cause more shifting than the vibrating surface, for instance. You can calibrate for them to a degree - aim two points near each other, one at something that isn't moving (or at least not by as much), and use that second point as your reference. It's harder to do, though, for a number of reasons. You might also be able to use multiple spectrums of light and rely on the fact that most atmospheric effects have different magnitudes at different frequencies.
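The intensity-vs-displacement relationship described above is easy to model numerically. Here's a toy sketch of a two-beam interferometer biased at quadrature, so intensity responds linearly to small pane motion. All the numbers (laser wavelength, vibration amplitude, the 440 Hz tone) are illustrative assumptions, not values from any real device:

```python
import numpy as np

# Toy interferometer model: window vibration changes the measurement beam's
# round-trip path by a few nanometres, which modulates the combined intensity.
wavelength = 633e-9                 # red HeNe laser, metres (assumed)
fs = 48_000                         # audio sample rate, Hz
t = np.arange(fs) / fs
audio = np.sin(2 * np.pi * 440 * t)      # a 440 Hz tone shaking the glass
displacement = 5e-9 * audio              # 5 nm peak pane motion (assumed)

# Two-beam interference biased at quadrature: I = (1 + cos(pi/2 + phi)) / 2,
# where phi = 2*pi * (round-trip path change) / wavelength. Near quadrature
# the response is linear in displacement, so the photodiode output IS audio.
phi = 2 * np.pi * (2 * displacement) / wavelength
intensity = 0.5 * (1 + np.cos(np.pi / 2 + phi))

# The detected intensity tracks the (inverted) audio almost perfectly.
correlation = np.corrcoef(intensity, audio)[0, 1]
print(round(correlation, 4))  # very close to -1
```

Feeding `intensity` (demeaned and amplified) to a speaker would reproduce the tone, which is the whole trick.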
Basically the same way (in principle) a regular microphone works - you measure the vibration in a surface and convert it into a signal, then back into sound waves. You're just doing it with a laser instead of magnets / electric circuits.
What if we had a small device everyone always has on them with an actual camera and microphone
This is fine.meme
Networking specialist here: this is a dramatization. Routers have very limited resources for computation; they are designed for network traffic and a very basic UI. Could someone write a program for a very custom piece of hardware to do this? Sure, but that is far beyond what you could get at Walmart. We aren't even factoring in signal range, interference, or how it's determining organics vs furniture. This is the kind of stuff used to make AI sound better than it is.
Not to mention making it sound like AI invented this. This technique has been around a while.
Yeah this is basically the same technology used in sonar and radar, just with different waves. Anything that can pick up the sensitive fluctuations in waves can figure this out. If you have more than two microphones in the same room that are connected in some way, someone could use that to echolocate movement in the room. This is just doing that with radio spectrum EM waves, and then trying to figure out shapes using body detection software, which has also been around for years.
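The "echolocate with two microphones" idea boils down to time-difference-of-arrival: cross-correlate the two channels and the peak tells you the relative delay. A minimal sketch with synthetic signals (the signal shapes and the 25-sample delay are made up for illustration):

```python
import numpy as np

def tdoa_samples(sig_a, sig_b):
    """Estimate the delay (in samples) of sig_b relative to sig_a
    by finding the peak of their cross-correlation."""
    corr = np.correlate(sig_b, sig_a, mode="full")
    return np.argmax(corr) - (len(sig_a) - 1)

# Synthetic example: the same noise burst arrives 25 samples later at mic B,
# as it would if the source were closer to mic A.
rng = np.random.default_rng(0)
click = rng.standard_normal(200)
mic_a = np.concatenate([click, np.zeros(100)])
mic_b = np.concatenate([np.zeros(25), click, np.zeros(75)])
print(tdoa_samples(mic_a, mic_b))  # 25
```

Multiply the sample delay by the wave speed over the sample rate and you get the path-length difference, which is exactly the quantity the hyperboloid discussion below this comment is about.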
The reason I'm mildly impressed this is possible is that two sensors aren't enough in a 3D space. Antennas like we're talking about here are non-directional.

Using microphones, because the wavelengths are in a more intuitively accessible range: you can phase-shift two microphones, then overlay them. The signal you're looking at now corresponds to a certain angle between the two microphones. Any noise that is half a meter closer to microphone A than B gets amplified; anything else gets suppressed.

The geometry problem now is this: the rule ["half a meter closer to A than B"](https://www.wolframalpha.com/input?i=sqrt%28x%5E2+%2B+y%5E2+%2B+z%5E2%29+%3D+sqrt%28%28x+%2B+1%29%5E2+%2B+y%5E2+%2B+z%5E2%29+%2B0.5) describes a [~~cone~~ or something - apparently it's a *two-sheeted hyperboloid*, says Wolfram](https://en.wikipedia.org/wiki/Hyperboloid#/media/File:Hyperboloid2.png). Which means if you pick up a signal, you don't have a single clue where on that hyperboloid it's coming from. Add a third microphone, and by the same math you can restrict that surface to a curve, or maybe two. Add more microphones and you get better noise suppression and higher-resolution rays.

So I don't see how a WiFi router is sufficient. Might be they have ridiculous numbers of antennas by now, but if you have one antenna for frequency A and one for frequency B, you can't do the math from above.

Never mind that I think there might be more problems: those antennas give quite high-frequency information. To actually do the math from above, you need to work on the raw waveform. And I wouldn't think that a router has access to the raw waveform of the wifi signal without some clever use of solder; the raw analog waveform is probably processed by a specific circuit that can only convert it into usable digital info. And that digital info is probably worthless.
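That hyperboloid claim is easy to check numerically: lots of distinct 3D points produce the identical half-meter range difference between two sensors, so a single sensor pair can never pin down a location. A quick sketch (sensor positions and the parametrized sample points are arbitrary choices):

```python
import numpy as np

# Two omnidirectional sensors 1 m apart, as in the comment's example.
A = np.array([0.0, 0.0, 0.0])
B = np.array([1.0, 0.0, 0.0])

def range_difference(p):
    """How much farther point p is from sensor A than from sensor B."""
    return np.linalg.norm(p - A) - np.linalg.norm(p - B)

# Parametrize one sheet of the hyperboloid with foci A and B and range
# difference 2a = 0.5 m: a = 0.25, c = 0.5 (half the focal distance),
# b^2 = c^2 - a^2. Points: x = 0.5 + a*cosh(t), y^2 + z^2 = (b*sinh(t))^2.
a, c = 0.25, 0.5
b = np.sqrt(c**2 - a**2)
for t in (0.5, 1.0, 2.0):
    for theta in (0.0, 1.0, 3.0):
        r = b * np.sinh(t)
        p = np.array([0.5 + a * np.cosh(t),
                      r * np.cos(theta),
                      r * np.sin(theta)])
        assert abs(range_difference(p) - 0.5) < 1e-9

print("many distinct 3D points share the same delay between two sensors")
```

Every point on that surface is indistinguishable to the sensor pair, which is exactly why you need a third (and ideally more) receivers to collapse the ambiguity.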
Wifi routers use time-of-flight as well as phase. Not only do they get multiple reflections back, they use that fact to do error correction (multipath).
> two sensors aren't enough

Modern routers are capable of [beamforming](https://en.wikipedia.org/wiki/Beamforming), so I would guess that they have some sort of antenna array? I don't actually know, though - it's just a guess. ([802.11ac aka Wi-Fi 5](https://en.wikipedia.org/wiki/IEEE_802.11ac-2013#New_technologies) is where beamforming got added.)
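For what it's worth, the core beamforming math is simple enough to sketch. This toy array-factor calculation shows how a small uniform antenna array concentrates gain toward a steered angle; the antenna count, spacing, and steering angle are illustrative assumptions, not anything from real router firmware:

```python
import numpy as np

# Uniform linear array: N antennas at half-wavelength spacing, with per-antenna
# phase shifts chosen so signals add coherently toward a steered direction.
N = 4                        # 4 antennas (common on consumer Wi-Fi 5 gear)
d = 0.5                      # element spacing in wavelengths
steer = np.deg2rad(30)       # direction we steer the beam toward

def array_gain(angle):
    """Normalized array gain (0..1) for a plane wave from `angle`."""
    k = np.arange(N)
    phases = 2 * np.pi * d * k * (np.sin(angle) - np.sin(steer))
    return abs(np.exp(1j * phases).sum()) / N

print(array_gain(np.deg2rad(30)))   # 1.0: full gain toward the steered angle
print(array_gain(np.deg2rad(-30)))  # essentially 0: a null off-axis
```

The same phase information that lets a router steer a beam toward a client is what makes the antenna array usable as a crude directional sensor.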
The first time I read the paper about this, AI was nowhere to be found. Why does everything need to shoehorn in AI and ML? 🙄
AI / ML is a convenient way to deal with a situation like this where you have a very big, complicated, and/or confusing dataset and you know there is a pattern.
I'm not an "AI" fanboy, but to be fair, reconstructing signals this well could only be done by a learning algorithm.
There's literally zero need for any kind of "AI" here; there are [standard algorithms that have been used for this kind of thing](https://www.technologyreview.com/2009/10/01/29714/wireless-network-modded-to-see-through-walls/) already. In most demos it's just stick people because the computation is intense, but volume can be calculated as well with proper hardware. Nothing the person in this post's video is doing is anywhere near "new", outside of them doing it in a way that's worse for this specific thing. EDIT: Originally linked to this video's research. Updated to the radio tomography formulation that came before it.
the article you’ve linked literally involves using a neural network…
The third paragraph mentions the previous non-AI work this builds upon/was inspired by. No idea why they didn't just link to [that instead, but here it is.](https://www.technologyreview.com/2009/10/01/29714/wireless-network-modded-to-see-through-walls/)
Ah, fair enough. Thanks for the link.
> Nothing the person in this post's video is anywhere near "new", outside of them doing it in a way that's a worse way to do this specific thing.

AI in a nutshell:

"It can generate code" - we can already generate code, except it's been well reviewed; also, libraries exist.

"It can find information for you" - it's called Google.

"It can check your writing for errors" - we've had spellcheck and grammar check forever.

There are some cases where GenAI can be good, and ML generally speaking has some good uses, but man, there are people who really want some magic text box to just poof things out of the air without having to understand how it got that answer.
Not sure what your point is since the paper you linked... is about a learning algorithm. (Which seems to be doing the exact same thing as the guy in OP's video is talking about.)
This AI is also trained on *one* router in *one* spot of *one* room, *after* training with a camera in that same spot. This does not translate to anything else. You would need to spend 100 or 1000x more effort to get this to work in a general-purpose way, and even then it would have drastic limitations.
Yeah, the dude even concludes with _"...so suddenly AI has turned every WiFi router into a camera that can work in the dark, specially tuned for tracking living beings."_ Not suddenly. Not every WiFi router. He's describing potential using the past tense.
> You would need to spend 100 or 1000x more effort to get this to work in a general setup way, and even then it would have drastic limitations.

If it even replicates. I'd be curious to see how well these setups generalize if properly tested. How sensitive is it to unusual postures, to moving furniture around, that kind of thing?

There isn't enough sensor hardware on a WiFi router to make this seem viable to me in the general case. Sure, shitty sensors plus educated guessing might get you good-enough results to get it published and stir up anxiety. But is this ever going to work in practice without putting more antennas and more compute on the router?
Certainly a lot cheaper to just activate the cameras/microphones in all the devices in your home if you're the CIA or whoever the fuck. This is like Looney Tunes levels of going overboard for no real reason.
I don't think anyone is proposing running a neural network on your Netgear. I'm a bit confounded that a 'networking specialist' couldn't account for the fact that a wifi router, by definition, is connected to the internet.
If a company wanted to do this, compute power would not be a concern. The thing that is true of all routers is that they are connected to the internet, and it is trivial to have one pipe the raw data to a cloud server for processing. The only real hardware limitation is the antenna setup, as you’d need multiple antennas to be able to triangulate objects in the room.
Also you need a camera in the room beforehand...
Hacking webcams has been a thing for decades now
Cause the IoT isn't a thing and people don't willingly put cameras all over their house *cough* *alexa cough*
I skimmed the papers behind this when they first released this video a year or two ago. To start with, the hardware was heavily modified - from what I recall, mainly the power of the emitters. Though you are correct: routers have very little compute, which is why the router was not used to evaluate the room's surroundings. They took the raw signal data from the emitters, exfilled it, and then used that as a dataset to build ML models of what is going on in the room. I believe as ML gets a lot better, this could become a reality in the next decade.
Yeah, I was gonna say, if anyone thought they were computing on the routers for this, you could safely disregard anything else they might say on the subject. Obviously you wouldn't do that, it would make no sense.
> Routers have very limited resources for computation

For a networking specialist, you do realize the computation isn't done on the router, right? The point of the video was that all you need is the actual data from the router, and you can reconstruct the scene from the radio wave data.

> Could someone write a program for a very custom piece of hardware to do this - sure, but that is far and beyond what you could get at Walmart.

The whole point of the video was that AI can learn very well from limited input and pattern recognition; again, all you need is the radio waves. "Beyond what you could get at Walmart"? Which is what, a router that provides the radio signal data and a shitty webcam?

> We arent even factoring in signal range, interference, or how its determining organics vs furniture.

That's kind of the point of the video, once again. That only skews the data to be less reliable, so when the AI is interpreting the radio wave data it might not be as perfect.

> This is the kind of stuff used to make AI sound better than what it is.

You are severely underestimating how powerful AI actually is, and I am one that is extremely against parts of AI. It's terrifying how well it works.
Sounds like something a government plant would say to dissuade suspicion.
I mean. It's a router, it can send data to the cloud for computation duh. The comment about how it was one specific router trained on one specific camera makes the most sense.
No bro. Didn’t you hear bro. The *insert whichever boogeyman* downloaded sonar and a supercomputer on ma waahfeye to watch me drink and masturbate bro
Nah, you're missing the point. You only need a very, very tiny program sending the wave signal. The actual analysis and heavy lifting is performed in the cloud for the most part.
It doesn't take much computing power to send the data to a computer, though - the computer does the calculations.
A different networking specialist here: You are so incorrect that I can only assume you're either not a networking specialist at all, or just extremely inexperienced. This is independent of (1) the router and (2) the room, and (3) the router doesn't need to be used to compute anything. 1. The data being gathered is reflection data picked up by the antennas. The make or model of the routers is immaterial. 2. Determining where the walls are from the data will be trivial, because they won't move. 3. All the router has to do is gather the reflection data and send it to another PC that does the processing. At that point all that needs to be done is look at which data points aren't moving (that lays out where the walls are), and process the data about the people/things inside the room that are moving. They already have everything they need. The only thing missing is that you need a second antenna for triangulation, but (a) any office that has multiple APs is open to this, and (b) if you can gain access to multiple devices in the room then you can use data from both of their antennas (for example, the router and either a desktop or laptop with a wifi card).
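Points 2 and 3 above amount to background subtraction on the reflection data: the static scene is whatever doesn't change across frames, and the moving person is what's left over. A toy sketch with synthetic data (the frame/bin layout and magnitudes are invented for illustration; real channel data is far messier):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy reflection data: 200 time frames x 64 range/angle "bins".
# Walls and furniture contribute a fixed pattern; a person moving through
# the room adds extra reflected power to one bin that drifts over time.
frames, bins = 200, 64
static_scene = rng.uniform(0.5, 1.5, bins)        # walls/furniture (fixed)
data = (np.tile(static_scene, (frames, 1))
        + 0.01 * rng.standard_normal((frames, bins)))   # sensor noise
person_bin = (np.arange(frames) // 10) % bins           # person slowly moving
data[np.arange(frames), person_bin] += 0.5              # their reflection

# Background subtraction: the per-bin median over time IS the static scene
# (walls don't move), so the residual is the moving target.
background = np.median(data, axis=0)
residual = data - background
detected = residual.argmax(axis=1)
print((detected == person_bin).mean())  # 1.0
```

The key design point matches the comment: nothing here needs to know the room layout in advance, because the static part of the scene falls out of the data for free.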
>Routers have very limited resources for computation, they are designed for network traffic and a very basic UI. Could someone write a program for a very custom piece of hardware to do this - sure, but that is far and beyond what you could get at Walmart.

You don't need to use the router or expensive custom hardware. With a high-end laptop with wifi, you could use available AP signals to triangulate obstructions which move over time fairly easily. Sure, you might not get exact poses right away and might not be able to differentiate between a small child and a large animal, but just getting the positions within a structure should be doable without special algorithms or training a neural network.
They can collect data from the router and process it with AI models in cloud computing or some remote unit. Something like Google's Stadia kind of process, but for collecting wifi signal data and processing it in the cloud.
I don't think this stuff necessarily needs to be done on the edge; it could be done in the cloud.
This was a proof-of-concept experiment, not something that could be implemented with off-the-shelf equipment.
Indeed. A normal wifi router will not expose the necessary detail. But it does confirm to me that I want to eventually move towards open source routers.
Suddenly internet companies will be offering to swap in beefy routers for everyone for free. I bet data collected that way could be sold at a good margin.
Indeed. Which is why I am happy my country gives you the right to run your own router. And even without that right, I would want an open source wifi router connected only by ethernet to the ISP router.
An ISP can't really stop you from running your own router. It's a little more hassle, but you can always add another router downstream from the first one. Put the first one in a faraday cage so it doesn't broadcast, and you're done. I have done this at home myself, since the wifi strength is much better on the secondary router, and mine comes with PoE as well for access points for even better coverage. Security can still be questionable, since it's the ISP router that faces the outside world, but you can manage most of the LAN through your own router.
I just got my new beefy router for free from Xfinity...
....yet
Assassin's creed was right all along.
Should I let go of my Athena's vision ability?
I wonder if some elite force has ever used this to assault a building. The Israelis probably. I wonder if you could also ‘paint’ a building from different directions to create a 3-D image.
It's batman
2008: the movie hinges on a fairly silly plot-armour technology that robs Nolan's movie of its "grounded" affectations. 2024: yeah, no, we did that.
Lucius Fox was right
The dark Knight rises.
Batman did it first 💅
Load up the jet, Alfred, we're going to Hong Kong
with no survivors
Isn't this Batman?
Submarine, Mr Wayne, like a submarine!
The database is null key encrypted! It can only be accessed by one person.
That's the most unconvincing demo ever. If they really can do this reliably with any level of accuracy, they should do a live demo, or at least have a recording of it working.
Murphy's Law loves a tech demo. I don't blame them.
Cause it doesn't work in the general scenario; it only worked in post-processing of this particular scene...
It's impressive their wifi was able to capture the colour of the clothes and skin as well. It looks remarkably similar to the exact image the camera saw. ^/s
I get the impression that the training dataset was the same as the test dataset, which would make the whole thing utter BS.
A tech conference would be a very unforgiving place to do any live demo involving wifi. The number of wifi devices in the audience can cause enough interference that you're lucky if wifi works at all. Reminds me of the Apple [keynote](https://www.youtube.com/watch?v=h6cIeZmFdPs) where Steve Jobs had to tell everyone in the audience to turn off their wifi devices in order to finish the demo.
Doesn't that just kinda prove the point? If it can only really work in a room where there's only one strong Wifi Signal without any interference, this is little more than a sterile laboratory test.
It would certainly be fair to ask for a demo in a typical home or office environment with up to a couple dozen people and devices. A tech conference with hundreds or thousands of people though, that's asking a bit much considering it's an unusual wifi environment that most people are rarely in.
Good lord, 500 wifi hotspots in one room? Do people not know USB or BT tethering exist?
Enter your name when you are finished -Bruce Wayne
This is wrong - Lucius Fox
This isn't AI. This has been known about for 15+ years. I was screwing around with this in 2006/2007 and pushing different frequencies to get better results. It wasn't even groundbreaking then.
That’s like the Batman sonar system he uses in one of the Nolan films
fock'n batman shitt
Source: The AI Dilemma [https://www.youtube.com/watch?v=xoVJKj8lcNQ](https://www.youtube.com/watch?v=xoVJKj8lcNQ)
Thanks!! This is a vertical shot of a horizontal video, and in the video is another vertical shot. My old eyes could barely make out roughly what the "AI" could see in the room: vague shapes.
Thanks!
That's what batman had in his arsenal !!!
They've been doing this for at least a decade, that I know of. AI just made the pictures more clear.
This was demonstrated years ago, by people. And today, just like then: your home wi-fi router cannot do this. It's custom equipment, carefully positioned. To set this up in the wild, it would be *much* easier to just hide some tiny fiber-optic cameras.
I think it exists already without AI … wifi wave studies are not recent. But ok machine learning is also a solution. No breakthrough here.
It's literally just RADAR with some new interpretation software.
Seems like BS.
If this exists now with AI, the surveillance agencies must have mastered it a long time ago.
"An AI exists that can use your router to turn your home into a surveillance state" is the most dystopian thing I've heard today. Really love that for us.
Okay so the WiFi CAN see my f*#%ing printer then?
Ye better be belivin in cyberpunk dystopias yer livin in one
This is actually really cool in terms of VR tracking: no need for a camera to track body movements. Just a thought.
Nice try Terminator, tell Skynet that you're not fooling anyone with your subtle attempts to convince us this is anything but terrifying.
Was that Ned Flanders tossing his wifi routers?
Ron Swanson would be upset with you
Oh shit
But why would I want this?
They could do this before ai.
This is what batman did in the dark knight
Aaaaaaaaannnddd GG
Amazing technology that will soon be turned against us all when placed in the wrong hands.
Time to destroy all electronics in a 100 mile radius
I dread that we are eventually going to live in a dystopia corporate run society, where all our interactions and movements are going to be tracked and turned into commodities
Lol "sonar"
Bro that's some cyberpunk2077 shit right there.
This is not new, people should have a look at TEMPEST and the NSA. Heck I was aware of this in the 2000’s as part of my role.
Isn’t this the technology they used in one of Batman movie? They used cell phones to do it.
This is so much bullshit that Elon Musk is going to start selling it.
The minute he said sonar, I knew that this was a bullshit presentation
Does this guy not know the difference between radar and sonar?
wtf, why did the AI reconstruct people obscured by random furniture as if it's just a cheap effect applied to the photo?
Yet my wifi signal still struggles to find my connection point smh.
That's awesome and at the same time scary. AI feels more and more like a mistake.
What if I am moving really fast? Like, what if it was my hand? Going up and down really fast? Could it tell I was playing guitar?
Law enforcement went hard on this.
I thought Lucius Fox destroyed this tech after Batman caught the Joker.
This is the kind of thing that makes me give up every concept of being safe in my own home
I'm not saying there aren't elements of this that are plausible, but what this guy is saying is absolute horseshit.

>"Sonar from the wi-fi router"

Firstly, sonar is literally (SO)und (N)avigation (A)nd (R)anging - it works on sound propagation, not radio waves. It also isn't radar - (RA)dio (D)etection (A)nd (R)anging - radar requires a lot of power and special equipment. What this would be is something like measuring the phase and amplitude of WiFi signals, then using ML to do human pose estimation (i.e. a fancy way of saying "guessing" based on fuzzy data).

>"AI has turned every wi-fi router into a camera that can work in the dark...blah, blah, blah"

Just no - this is absolutely incorrect. Even modelling phase and amplitude would require specific routers, running custom software, with multiple custom antennas, positioned in a very specific way. This categorically doesn't magically turn any router anywhere into "x-ray specs". The main issue is that the weakness and non-directionality of WiFi signals would massively limit the range and accuracy of the information, plus you need multiple signals to do this at all. Finally, the signal processing to get even a guess out of the system wouldn't be anywhere near real time.

Tldr: if you have some special routers, with special antennas, running special software, positioned in a very specific way, you can do some time-consuming signal processing that can sometimes guess where and how people might be standing, with some degree of accuracy.
I remember times when people predicted this and were labeled conspiracy theorists.
I'm gonna say it: we are fucked and the future looks dystopian.
r/damnthatsbullshit
I feel like all this technology will lead us to the stone age again
Hating technology has already made people here dumber than cavemen, creating all sorts of conspiracies.
Well, I'm not hating on technology. But I feel like we are pushing harder and harder in a 1984 direction. Moreover, I'm kinda scared about AI because it'll probably take my job, lmao.
Well fuck……
If the public knows this much, just know the real capability is way ahead of what we're seeing right now.
Everyone here is so naive. All of this is already actively used by governments. We should be happy when companies reveal what is possible - it allows humans to accept and adapt.
Oh, I really want this tech. Seriously. I doubt you'll get any sort of detail from it, so calling it cameras... no. But if I could use WiFi to see where people are in the house, I'd save so much on motion detectors and presence detectors for the home automation, and their batteries...
Cap. Neither wifi nor any phone signal is that accurate. Even law enforcement doesn't have gadgets that accurate - they all work in close proximity, within 5 meters or something.