100x Faster Than Wi-Fi: Light-Based Networking Standard Released (tomshardware.com)
272 points by rbanffy 10 months ago | 154 comments



> by preventing wall penetration, reducing jamming

This is the main selling point I think, though not so much intentional jamming... on top of the inherent lower latency, lower jitter, and higher throughput, unlike Wi-Fi none of these aspects are hampered by proximity to adjacent signals from other networks and other EM sources.

Even if people don't want to kit out their entire house a la PoE, it would be a nice benefit to have this work side by side with wifi... bad signal? Just walk into the room with the router, auto-switch to lifi, and it's effectively as good as wired. Also, more devices automatically using lifi when wifi is not necessary will alleviate interference for everything and everybody else where it's actually necessary. It's a win-win technology.

It's also interesting to see something move (relatively) quickly from experiment to standards proposal. I suppose that's due to the practicality of this tech.


In high-density living (condo towers, downtown, etc.), it is common to have interference from neighbors' wifi networks on the same floor or adjacent floors.

Having a network that extends using a non-interfering frequency would be a godsend. Especially given the limited number of channels in the existing WiFi standards.


Newb question: since they are both electromagnetic radiation, how does Li-Fi get up to 100x faster than Wi-Fi? Are the light transmitters and sensors that much faster? Or perhaps is it able to use a wider band of frequencies?

I'm also so curious how this ends up working in practice. Even using infrared, would it interfere with things like baby monitors in night mode? The Wikipedia article says it can be tuned to be less intense than humans can perceive, but I'm curious if that's true in practice. (Granted, babies don't typically need Internet access while they are sleeping, but maybe the monitor itself does.)


I don't know anything about LiFi, but for EM radiation in general, the higher the frequency, the higher the theoretical maximum bandwidth.

https://en.wikipedia.org/wiki/Shannon%E2%80%93Hartley_theore...

Wifi is 2.4 GHz or 5 GHz; visible red light is about 430 terahertz.

Or it sounds like LiFi is just pulsing the lights on and off, in which case the Nyquist rate, originally worked out for telegraphy, is a better analogy

https://en.wikipedia.org/wiki/Nyquist_rate


> the higher the frequency, the higher the theoretical maximum bandwidth.

This is not true and is unrelated to Shannon's theorem.

Shannon's theorem shows us that wider bandwidths allow for larger bit rates. At higher frequencies our bandwidths can be bigger. For example a band from 1 to 2 terahertz is 1 terahertz wide, which is 1000 times larger than a band from 1 to 2 gigahertz (1 gigahertz wide).

The total bandwidth available (including multiple channels) for 2.4 GHz Wi-Fi is about 100 MHz. The total space available for this new standard is 800 to 1000 nm [0], which is roughly 75 THz. That's about 750,000 times wider than Wi-Fi. That is why you get higher bit rates with this new standard, AKA more throughput, or more "bandwidth", when the term is used to mean data rate.

[0]: https://standards.ieee.org/ieee/802.11bb/10823/
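
To put rough numbers on that comparison (a minimal sketch; the shared 20 dB SNR is an illustrative assumption, not a measured figure):

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_linear):
    # Shannon-Hartley: C = B * log2(1 + S/N)
    return bandwidth_hz * math.log2(1 + snr_linear)

snr = 10 ** (20 / 10)  # assume the same 20 dB SNR on both links (purely illustrative)

wifi_24ghz = shannon_capacity_bps(100e6, snr)  # ~100 MHz total in the 2.4 GHz band
optical = shannon_capacity_bps(75e12, snr)     # ~75 THz between 800 and 1000 nm

print(f"2.4 GHz Wi-Fi ceiling: {wifi_24ghz / 1e9:.2f} Gbit/s")   # ~0.67 Gbit/s
print(f"800-1000 nm ceiling:   {optical / 1e12:.0f} Tbit/s")     # ~499 Tbit/s
```

Same SNR on both, so the difference comes entirely from the available bandwidth.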


> > the higher the frequency, the higher the theoretical maximum bandwidth.

> This is not true and is unrelated to Shannon's theorem.

You are correct that it is unrelated to Shannon, but it is still true. The higher your carrier frequency, the higher your theoretical maximum bandwidth (in the correct meaning, i.e. occupied spectrum). You can never have negative frequencies, so the maximum bandwidth you can modulate a 1 Hz carrier to is 2 Hz (modulation bandwidth extends to positive and negative frequencies around the carrier). A 10 Hz carrier can be modulated to 20 Hz...

> Shannon's theorem shows us that wider bandwidths allow for larger bit rates. At higher frequencies our bandwidths can be bigger. For example a band from 1 to 2 terahertz is 1 terahertz wide, which is 1000 times larger than a band from 1 to 2 gigahertz (1 gigahertz wide).

So you are contradicting yourself? Not sure why you said the earlier statement is not true?


My point was that Shannon's theorem is defined in terms of bandwidth. I think speaking about frequency is misleading, even though it's true when discussing carrier/central frequencies. I shouldn't have said OP's statement was untrue, since it's strictly true as you say, that higher frequencies allow for wider bandwidth. They just don't have to have wider bandwidths, which I was trying to make clear. Thanks for the correction though!


I think this all boils down to the confusion caused by people using bandwidth and capacity when they really mean throughput. Talk about the capacity of your internet connection to a communication theorist if you want to start a rant.


I mean it sort of is true. If you're at 100THz, you can get a bandwidth of 1THz. If you're at 100KHz, you are not going to get a bandwidth of 1THz.


True. The “band” is a range of frequencies, from lower bound to upper bound. You can have a single-frequency carrier that you modulate, in which case your bandwidth has more to do with your modulation scheme and the rates implied by that.


Can I ask a noob question?

Suppose we have a 1 Hz signal; what's stopping us from sending/receiving 10 or 100 bits of info every second by modulating the amplitude of the signal?


You can use FM, AM, QAM, or other multi-bit modulation schemes to send that information, but you need to have the signal-to-noise ratio to demodulate it. WiFi actually goes up to QAM-1024 (10 bits per symbol) in the more recent specs. However, the SNR you need to decode that reliably is something like 35 dB, while recovering a signal that sends 1 bit at a time needs ~3 dB. A 35 dB SNR is very hard to reach unless the RF environment is quiet (basically impossible in an apartment building, for example), but 3 dB is easy.

Shannon's limit tells you about the total information capacity of a channel given its bandwidth and SNR. This limit is usually approached by using deep modulation schemes combined with error-correcting codes to recover the lost data.
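
A back-of-the-envelope version of that trade-off, using the Shannon limit as a lower bound on the SNR each constellation needs (real receivers need several dB of margin on top; the 35 dB / 3 dB figures above already include such margins):

```python
import math

def min_snr_db(bits_per_symbol):
    # Invert Shannon-Hartley at one symbol per hertz: SNR >= 2^k - 1
    return 10 * math.log10(2 ** bits_per_symbol - 1)

for name, bits in [("BPSK, 1 bit/symbol", 1),
                   ("QAM-256, 8 bits/symbol", 8),
                   ("QAM-1024, 10 bits/symbol", 10)]:
    print(f"{name}: Shannon floor ~{min_snr_db(bits):.1f} dB SNR")
```

Going from 1 to 10 bits per symbol pushes the floor from 0 dB to about 30 dB, which is why deep QAM only works in quiet RF environments.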


When you modulate that 1 Hz signal you're generating power at frequencies other than your 1 Hz carrier.

See https://en.wikipedia.org/wiki/Amplitude_modulation#Spectrum


Sorry, it's not clear to me.

A modulated signal could be expressed as a sum of signals with different frequencies, but will it be registered by the receiver as a signal at these frequencies?

Suppose we send a 1 Hz signal with length = 1 hour. In the middle we change the amplitude of one wave to 1/2. Does the receiver receive a mix of different frequencies?


The only signal that contains only 1 Hz and no other frequencies is a perfect 1 Hz sine wave. As soon as you start modulating the amplitudes away from that sine wave, you're introducing content at higher frequencies. You can use that higher frequency content to transmit information at more than 1 bit per second, but you're not exactly using the 1 Hz signal to transmit information.
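
You can see this spreading directly with an FFT (a toy numpy sketch with made-up sample rate and bit rate, not tied to any Li-Fi hardware): a bare 1 Hz sine occupies a single spectral line, while the same carrier on-off keyed with 10 bits per second of random data occupies a band tens of hertz wide.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, duration, bit_rate = 1000, 60, 10          # illustrative numbers only
t = np.arange(0, duration, 1 / fs)
carrier = np.sin(2 * np.pi * 1.0 * t)          # bare 1 Hz carrier

bits = rng.integers(0, 2, duration * bit_rate)
keyed = np.repeat(bits, fs // bit_rate).astype(float) * carrier  # on-off keyed at 10 bit/s

def occupied_band(x, fraction=0.99):
    # Smallest set of strongest FFT bins holding `fraction` of the energy
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    order = np.argsort(power)[::-1]
    cum = np.cumsum(power[order]) / power.sum()
    used = order[: np.searchsorted(cum, fraction) + 1]
    return freqs[used].min(), freqs[used].max()

print("99% of energy, bare 1 Hz carrier:", occupied_band(carrier))
print("99% of energy, keyed at 10 bit/s:", occupied_band(keyed))
```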


https://en.wikipedia.org/wiki/Single-sideband_modulation

No need for higher frequency content. But SNR will have to be good enough.


SSB still makes other frequencies. Viewed on a waterfall, it's just half of AM, minus the carrier. Still occupies a range of frequencies, approx half of what AM does.


No higher frequency content above the carrier.


10 bits is at least plausible, but if you want to try to transmit 0.1 kbps with no frequency content over 1 Hz, you're going to need the mother of all SNRs.
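
The mother of all SNRs can be quantified from the Shannon-Hartley formula (a worked example, not a claim about any real system):

```python
import math

bandwidth_hz = 1.0
target_bps = 100.0   # the "0.1 kbps" from the question above

# C = B * log2(1 + SNR)  =>  SNR = 2^(C/B) - 1
snr_linear = 2 ** (target_bps / bandwidth_hz) - 1
print(f"required SNR: {10 * math.log10(snr_linear):.0f} dB")   # ~301 dB
```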


Just imagine what we'll be able to do with Gamma data transmissions ;-)

I like to explain to my kids that it's all colors - we are transparent to some (such as X-rays) the same way some fish are mostly transparent to the light we can see and walls are transparent to the radios we use in Wi-Fi, and also that both snakes and bees can see light in colors we can't (snakes see IR, bees see UV).


Face melting speed.


I think you're misreading Shannon's Law. It's not that the higher the frequency the greater the max bandwidth. It's the higher the bandwidth the more data you can put through.


I am agreeing with you and providing more info:

https://en.wikipedia.org/wiki/Shannon%E2%80%93Hartley_theore...

Maximum channel capacity depends on signal bandwidth and SNR.


I think Li-Fi is more comparable to Ethernet than to Wi-Fi, not because the fundamentals are very different but because of the link budgets available. If you think of an ethernet line as a pipe carrying data, then a collimated laser can be thought of in a similar way - most of the energy that you transmit is going to make it to the destination, and that is not the case with Wi-Fi, even with very high gain antennas. This allows for different modulation schemes and thus higher throughput. Copper ethernet is now capable of 1.6 Tbps [1] and Li-Fi doesn't seem so very fast compared to that; however, keep in mind this is comparing only physical layers. Demonstrations of optical laser links of hundreds of Gbps over hundreds of kilometers have taken place [2] using COTS optical ethernet transceivers and special output stages providing precise collimation and pointing.

[1] https://en.wikipedia.org/wiki/Ethernet_physical_layer#1.6_Tb...

[2] https://ntrs.nasa.gov/api/citations/20210026855/downloads/sp...

For your second question, I'm not sure how baby monitors work, but the proposed wavelengths are unlicensed and there are few if any rules for how to deal with interference. There are rules for laser eye safety, which limit the maximum energy that can be delivered at the output. Generally, as Li-Fi gets more common we will have to learn to deal with interference as it arrives. For example, Lidar systems (older, noncoherent ones) interfere with one another and are even susceptible to interference from IR motion detectors and such, but these aspects have to be considered during design.
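
To make the link budget point above concrete, here is a purely geometric sketch with made-up but plausible numbers (a 1 cm² receiver aperture at 10 m; real systems add antenna gain, optics losses, and pointing error):

```python
import math

distance_m = 10.0
rx_aperture_cm2 = 1.0     # hypothetical 1 cm^2 photodiode / antenna aperture

# Isotropic RF: transmitted power spreads over a sphere of radius `distance_m`
sphere_cm2 = 4 * math.pi * (distance_m * 100) ** 2
isotropic_fraction = rx_aperture_cm2 / sphere_cm2

# Well-collimated beam: assume a 2 cm diameter spot landing on the receiver
spot_cm2 = math.pi * 1.0 ** 2
beam_fraction = min(1.0, rx_aperture_cm2 / spot_cm2)

print(f"isotropic RF, power fraction captured:    {isotropic_fraction:.1e}")  # ~8e-8
print(f"collimated beam, power fraction captured: {beam_fraction:.2f}")       # ~0.32
```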


My understanding is they went with 800 to 1000 nm (infrared), or roughly 375 down to 300 THz. I'm not sure as to the total combined bandwidth of all of WiFi 6 or 7, but a 75,000 GHz band gives them a lot to play with.
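
The conversion behind those numbers is just the speed of light divided by the wavelength:

```python
c = 299_792_458  # speed of light, m/s

f_hi = c / 800e-9    # ~375 THz at 800 nm
f_lo = c / 1000e-9   # ~300 THz at 1000 nm
print(f"800 nm -> {f_hi / 1e12:.0f} THz, 1000 nm -> {f_lo / 1e12:.0f} THz")
print(f"usable optical band: ~{(f_hi - f_lo) / 1e12:.0f} THz")
```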


> Newb question: since they are both electromagnetic radiation, how does Li-Fi get up to 100x faster than Wi-Fi?

They specified that they wanted to be 100x faster, and they added parallel channels until they reached that goal. No, really, this is the real reason.

It is entirely nonsensical to ask for a physical reason, because different channels are just different.


Well, the physical reason is that the band available in the visible light spectrum allows you to add that many channels in parallel. You can't do that at 2.4 GHz.


> Even using infrared, would it interfere with things like baby monitors in night mode?

If you're referring to the infrared LEDs that illuminate the baby, their light is not polarized, while light used for communication is, so a polarizing filter in the receiver can filter out such noise.


Baby monitors with night vision usually have a small set of tiny infrared lights so that they can see. Adding additional infrared to the room will help the monitor to see better.


Given how they are calibrated I don't think so.... More likely the image would be getting constantly washed out, right?


Depending on how fast the extra IR light is pulsing, it might end up looking like your baby is sleeping through a rave - which would be entertaining enough to be passed off as a feature rather than a bug.


"Between 3-30 hertz (flashes per second) are the common rates to trigger seizures but this varies from person to person. While some people are sensitive at frequencies up to 60 hertz, sensitivity under 3 hertz is not common"

https://epilepsysociety.org.uk/about-epilepsy/epileptic-seiz...

This came immediately to mind; when comorbid with infrared sensitivity, it is likely to trigger people without any apparent cause visible to third-party observers....

People forget that the average human barely sees jack sht compared to the remarkable exceptions of our species, let alone that such exceptions often have disabling/uncommon conditions comorbid with their remarkable capabilities.


It's likely going to be super fast.


You can encode a lot more information in higher frequencies.

But each frequency has different things that are opaque to it, and travel different distances before dropping off.

Wi-Fi is already “really low frequency, really low intensity infrared light”, so I suspect the article is correct when it says we won’t notice it.


> Wi-Fi is already “really low frequency, really low intensity infrared light”

No, infrared by definition starts around 300 GHz and goes up from there.


Sorry. I was just going by the literal reading of the infra (under) red.


Correct. The reason why microwaves can cook food isn't the fact that it is a microwave frequency. Microwave ovens cook by flipping the polarity back and forth. The frequency emitted is the same resonant frequency as water molecules, so the water molecules attempt to align constantly to the ever-changing polarity. Movement is heat; thus, the water heats the food.


To set the record straight for the above comment -

- it's true that there isn't a precise frequency needed for microwave ovens to heat food

- however, "polarity flipping" is just a description of electromagnetic radiation itself, and shooting enough EM radiation at food in a frequency range it absorbs will heat it up via dielectric heating

- microwaves have no relationship to any specific resonant frequency of water - the vibration frequencies are orders of magnitude higher https://en.wikipedia.org/wiki/Electromagnetic_absorption_by_... while rotation response inherently does not have a peak

- otherwise, yes, the motion of (polar) molecules induced by an electric field is indeed the mechanism of dielectric heating


I've heard the resonance with water explanation is a common misconception: https://physics.stackexchange.com/questions/150128/how-do-mi...


More of an oversimplification: polar bonds interact with the EM field of the microwave oven.

Water is one of many thousands of compounds having polar bonds.


That doesn't sound like resonance to me, which is about constructive interference and natural frequencies. So more of an inaccuracy I think.


On one hand, I see where this could be helpful in certain scenarios like an office, where there is a consistent and planned layout specifically for the purpose of productivity. Or for military applications, where EMSEC is taken very seriously (to the point where Wi-Fi is generally not used at all in most classified facilities). Though I am also not convinced this would change the calculus much there in reality.

On the other hand, I don’t see the draws outweighing what seem to be clear setbacks. E.g. if I put my LiFi enabled phone in my pocket mid download, it will completely cease to work.

What is interesting is the idea of a much more comprehensively connected future. E.g. imagine a building with both Wi-Fi and LiFi enabled, with automatic switching between the two based on which is less congested and provides the best speeds. As our daily bandwidth footprint grows, I can see the benefit in having multiple spectra for information transmission.


I’m not sure that this provides any benefit from an EMSEC perspective, as you’d basically be going from a position where you’re avoiding radios and even cables and similar things that aren’t TEMPEST shielded for fear of emissions leaking sensitive data, to a position where you’re broadcasting your sensitive data over the air, and a listening device simply needs to look at your lights somehow. I will agree that it’s easier to block light than radio, but I think that’s where the advantages end.


I think you have this backwards. You'd still encrypt your transmissions the same as if you were using WiFi, so it's a wash in terms of security. But it should be much harder to jam your receiver/transmitter, because it's point-to-point.


> E.g. if I put my LiFi enabled phone in my pocket mid download, it will completely cease to work.

The light part, sure, but the regular radio wifi part will be fine; it'll be slower, but it won't go away. Ideally there's a seamless transition between the networks: LiFi if you have your phone out and there's a transmitter in the receiver's line of sight, WiFi in other cases.


I think it'll work about as well as moving from ethernet to wifi, or from wifi to cellular, which is to say it doesn't really work seamlessly and it's quirky :(


Your mobile phone needs to move between cellular towers quite frequently when you are moving around. You don't notice this handoff because making it seamless is important to the functioning of the network.

If we decide that moving from Li-Fi to Wi-Fi is important, we can make it seamless.


Right, but I think it's not pretty on the carrier's side. Maybe some more abstraction over where exactly the data is flowing through is needed, similar to what WireGuard does.


Moving between wifi access points is already completely seamless in a modern mesh wifi network.


Right, this is similar to hopping across cellular antennas: someone tries to keep it seamless for you while you hop within their controlled environment, but we still have challenges when your public IP address drastically changes (Wi-Fi -> cellular on phones).


With more demand hopefully they will fix that and they'll have sub millisecond handoff or something.


This is mentioned in the article

> Of course, Li-Fi isn’t going to sweep away Wi-Fi and 5G alternatives (nor wired networks). Radio waves still have a distinct advantage with regard to transmission through the atmosphere at great distance, and through opaque objects. Instead, work must concentrate on using horses for courses – with Li-Fi advantages being harvested where possible.


> if I put my LiFi enabled phone in my pocket mid download, it will completely cease to work.

As per your EMSEC use-case, this is also a privacy benefit, as your pocket becomes a de facto Faraday cage, guaranteeing that your devices can only transmit information when you want them to.


In your pocket, or just turn around so your body is in between the source and your device? Maybe reflections get the job done, but that has to harm data rates, right? I am not an optical expert.


I've researched LiFi before, but every time I research it I find expensive commercial equipment, or hobbyists playing with very low data rates on Arduino.

I have many questions, especially about how a LiFi receiver works. Wouldn't this essentially need to be a high-speed camera with very few pixels?

Does anyone have recommendations of a dev kit, or transceivers to play with this? Also, ones that don't cost several thousand dollars?


The technology is mostly in those two camps because it's new, expensive, and niche, like WiFi was 20 years ago. As production increases (starts?), components will drop in cost and become standardized and more ubiquitous. At the moment there is a lot of practical research you can read about from conferences like OFC [1] and SPIE PW in the free-space laser or telecom tracks [2]. At the moment transmitters and receivers are made up of components highly specified for their use case and are very parametrized; for example, there are hundreds of different DFB lasers that can be used as sources.

I'm a little surprised that the IEEE has already standardized, especially given the wavelength they chose, but I imagine their members felt forced to adopt a proliferating technology, as without a standard they risk the technology moving ahead without them.

[1] https://www.ofcconference.org/en-us/home/about/archive/

[2] https://spie.org/Publications/Proceedings/Volume/12413?&orig...


Basically, the only new principle involved is that instead of de-serialized, subcarrier-modulated data being modulated onto a 2.4 GHz sine wave and emitted from the antenna as electric field changes, it is now sent as changes in light level on a subcarrier frequency.

Or more simply, maybe it could be done YouTuber-style with a light-emitting diode on the Tx antenna port and a photosensitive diode on the Rx antenna port? The switching speed of the Tx-side LED could become the limiting factor in that case.
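
A toy software version of that principle (simple on-off keying with an averaging receiver; this is a sketch of the general idea, not the 802.11bb PHY, and the noise level is an arbitrary assumption):

```python
import numpy as np

rng = np.random.default_rng(1)
samples_per_bit = 8

def ook_transmit(bits):
    # LED drive level per sample: on for 1, off for 0
    return np.repeat(np.asarray(bits, dtype=float), samples_per_bit)

def ook_receive(light_levels):
    # Photodiode samples, averaged per bit period, thresholded at half scale
    per_bit = light_levels.reshape(-1, samples_per_bit).mean(axis=1)
    return (per_bit > 0.5).astype(int)

bits = rng.integers(0, 2, 64)
received_light = ook_transmit(bits) + rng.normal(0, 0.1, 64 * samples_per_bit)  # add noise
print("bit errors:", int(np.count_nonzero(ook_receive(received_light) != bits)))
```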


>Basically, the only new principle involved...

But you forgot the most important aspect: Is side-fumbling effectively prevented?


> wouldn't this essentially need to be a high speed camera with very few pixels?

A camera sensor 'pixel' is just a device which conducts in proportion to the number of photons that hit it. A typical digital color camera uses CMOS chips to do this, with a filter on top of them to isolate red, green, and blue. It is pretty basic; the real trick is getting millions of them on a 1/4" sensor and having them relay the data properly with a reasonable amount of noise.

So, yes.


We have other devices that are "just a device which conducts proportional to the amount of photons that hit it" but we do not call all of them "cameras".

A camera is a device which receives light signals and translates those into an image of some format.

A photovoltaic cell in a solar panel is a device which generates electricity proportional to the number of photons that hit it.

A photoresistor is a device which resists current in proportion to the number of photons which hit it.

See now, such a LiFi transceiver would not necessarily be termed a "camera" any more than the infrared sensor in urinals is a camera. I believe that's the way we want it to be, right? LiFi has no use for producing images, only translating light back into network and signaling data. That's not called a "camera" by any means.


> We have other devices that are "just a device which conducts proportional to the amount of photons that hit it" but we do not call all of them "cameras".

But if you took those devices and made an array of them you can make a camera sensor. Ergo, if you take one element of a camera sensor you have a light detector element.


You want to call it a "camera sensor" but I personally would not attach the moniker "camera" unless they are in the business of translating light into images of some kind. Here, let's ask Wikipedia:

"A camera is an optical instrument used to capture and store images or videos, either digitally via an electronic image sensor, or chemically via a light-sensitive material such as photographic film."

See now, a camera is the whole instrument, not merely its image sensor. But a camera uses an "image sensor". What is an image sensor?

"An image sensor or imager is a sensor that detects and conveys information used to form an image. It does so by converting the variable attenuation of light waves (as they pass through or reflect off objects) into signals, small bursts of current that convey the information."

So there is no way we've described what's going on in LiFi. For example, you walk up to a urinal and the infrared sensor detects you. Does it paint an image of your privates on a website? No. There is no camera in the urinal, hopefully. The urinal is only interested in whether you are standing right there or if you've left. The urinal does not employ an "image sensor", it uses something dumber.

Likewise, do your solar panels use cameras? They conduct based on exposure to light, don't they? But what is a solar panel concerned about? It generates electricity, not images. A photovoltaic cell is not an image sensor because it has nothing to do with images.

What would your camera be if I disabled the viewfinder display and eliminated its ability to save files on sdcard? Would it still be a camera if its sensors produced electricity but it couldn't provide me an image based on that conduction?

LiFi is not using something dumber, but LiFi is likewise unconcerned about creating images. Since a camera is, by definition, concerned with images, LiFi does not use cameras.


It isn't a 'camera' but what is 'essentially a camera with very few pixels' but a small array of light sensor elements?

I think you should take a step back and look at this logically and stop trying to be right.


> stop trying to be right.

I think you would benefit from this advice as well; and I did attempt to apply logic, but you're ignoring the quoted Wikipedia definitions, and essentially we're just talking past each other, and I have no idea what sort of terminology you're trying to throw around now because it doesn't evidently have anything to do with LiFi tooling as it is.


I answered a simple question: 'is this essentially a camera with few pixels', and I defined what a camera pixel is and agreed that it is 'essentially a camera with few pixels'. It obviously isn't a camera, just like a bicycle is not a motorcycle, but it is 'essentially a motorcycle with a person as an engine'.

If you still think that is wrong, then that's fine I guess; it's your opinion and you are welcome to it.


I think the word being looked for here is analogy rather than essential.

The essence of a bicycle is not a motorcycle with a human motor. However, that is a very good analogy for what a bicycle is.


I appreciate the agreeable answer, but it's also worth noting that the bicycle came first in its simplicity, and so the correct framing is that a motorcycle is a bicycle with an internal combustion human.

So how did cameras start? Well, the word is literally Latin for "room" because a man would go into a small, darkened room with only a pinhole opening at one end, and he could observe an image projected on the far wall.

So the original "camera obscura" had no lens or sensors at all! It was essentially a pinhole aperture and a screen. The observer could then paint or draw according to the projected image he perceived with the image sensors in his eyes.


> but LiFi is likewise unconcerned about creating images.

Off topic, but I sense great pun potential here.


Reminds me of the trend in the mid-2000s of laptops having IrDA ports. A little bit annoying to use for portable devices.

Definitely has a niche application in areas where RFI is to be minimized.


Also to a more trivial extent, perhaps the same idea behind the IR ports on Gameboy Colors. Which is a really interesting and probably underutilized feature to think back on now. And even more interesting in that this preceded the first mass consumer devices offering WiFi, which Wikipedia tells me only really took off with Apple’s iBooks in 1999.


I immediately thought of https://en.wikipedia.org/wiki/RONJA


Well I remember syncing my Palm Pilot via IrDA (but it was connected to the serial port, I just had a tower PC)


Lego Mindstorms bricks could be programmed over an IR connection like this. The earliest ones shipped with an IR transmitter that had a 9-pin serial socket on the back.


IR was great, you could do things like transfer contacts and data between palmtops.

I mean I got a palmtop (a Palm V iirc) for cheap well after they were commonly used and I never used it for anything important, but still, it was a cool device. I think I have it somewhere still, wonder if it still works. I mainly used it to play Sudoku on though.


> 100x Faster Than Wi-Fi: Light-Based Networking Standard Released

My first thought on reading that headline: isn't that single/multimode fiber?


> My first thought on reading that headline: isn't that single/multimode fiber?

Kind of like how radio is a wireless telegraph system:

> You see, wire telegraph is a kind of a very, very long cat. You pull his tail in New York and his head is meowing in Los Angeles. Do you understand this? And radio operates exactly the same way: you send signals here, they receive them there. The only difference is that there is no cat.

* https://quoteinvestigator.com/2012/02/24/telegraph-cat/

This time the tail is fibre and not copper, except there is no tail.


This was tried with WiGig, and that didn’t solve any problems significant enough that it caught on. People tried to use WiGig for things like wireless docking stations, and I suspect it just wasn’t needed because at such short ranges, most people would want/need to have a power cable connected (which naturally leads to the USBC docking stations we have today, where both power and data go over the same cable).


WiGig wasn't fast enough to be useful for wireless HDMI, especially given 4K displays arrived right at that moment, and 2nd-gen WiGig is probably way too expensive.

The question is whether it's possible to create a full-rate 224 Gbps transceiver that is cheap enough to appear in mid-range phones, notebooks, and TVs, and whether it would work without line of sight, but in the same room, with reflected light.


The wireless adapter for the HTC Vive headset was based on WiGig. The range is limited to a few meters, but that's more than enough for room-scale VR. The need for massive bandwidth and low latency make it a perfect use case for WiGig.


WiGig just came too early, the underlying tech was also barely able to support it. Windows 7 and Vista era, expensive and the software really wasn't there either.

It might be different now with VR and the rise of docks.


https://en.wikipedia.org/wiki/Li-Fi

For those like me confused about how this would even work.


Clicked into the comments hoping such a comment would be here.

Of note is that the first commercially available Li-Fi system has been available since 2014, and it hasn't gained any traction?


> Of note is that the first commercially available Li-Fi system has been available since 2014, and it hasn't gained any traction?

Probably for the normal early reasons: too big, fiddly, and pricey. Someone must have finally shrunk and/or cheapened it enough for a large segment of enthusiasts or businesses for it to gain more attention again.


Reading that, it seems to be about consuming/receiving data rather than sending it. Or do you have light shining out of your laptop back at it?

There is an example of a school using it, and the students have a USB device in their laptop to read the light, but is the upload slow or are they also connected to Wi-Fi?


Back in the day, I had a laptop with an infrared serial port. Considering WiFi cards weren't a thing at the time, it was pretty handy to "Air Drop" files to other students.


I recall seeing a LiFi demonstration video around 2012-2013 and was wondering what ever happened to it. (source: https://lifi.co/lifi-videos/harald-haas-ted-talk-2011/)


What is the use case today for 224 GB/s wifi? 8k+ video? Genuinely curious.


"LiFi" does seem pretty niche. Anywhere you're okay with a 10-15m range, have full line-of-sight, somehow can't run a wire or fiber run, and don't want regular WiFi for whatever reason.


Only thing I can think of is full fat uncompressed VR streaming to a wireless headset.


Could be useful for modular autonomous robotic devices that act as a single organism, like ants or cells. They would each do their own thing but communicate using this (or something else fast, short-range, and error-resilient) so as to work towards a unified goal without a lot of redundancy.


I can think of two uses:

Open floor plan offices. (Sigh.)

Datacenters or server rooms. This could give quite nice data rates within a rack.


Any place you have this in a point-to-point rig, you could have fiber cables guiding the light and creating a private collision domain, improving overall bandwidth, reliability and security.

The cost of datacenters is not currently constrained by fiber.


Fiber is not free to run. In an open floor plan office with 20 workstations on an overgrown table, you could run 20 fiber pairs (or cat5e cables or cat6a cables), and you could terminate them and connect them, or you could set up one LiFi access point and 20 client devices. The latter is a lot less cable bundle and a lot less labor.


This can be done now with wifi. The question I answered was about datacenters.

There is wifi in datacenters -- usually for the benefit of visiting techs. Not for inside a rack.

Incidentally, if asked to set up 20 workstations on an overgrown table, and it's not a very temporary thing while the office is in turmoil, I recommend finding a different employer.


That is like saying, "handing a 2 TB hard drive to someone is 1,500 times the download speed of wifi".


"Never underestimate the bandwidth of a station wagon full of tapes hurtling down the highway."

https://archive.org/details/computernetworks02tane/page/56/m...


Maybe this is the solution to my niche issue.

Better network speeds in a room surrounded by Faraday cages with MR scanners in them.

Optical is an option but a right pita to run without holing the cage or circumnavigating it.


Niche also means it will be expensive.


For your home network, there are likely few or no use cases to warrant a new tech like this.

But for offices, schools, conference halls, and other places with many people together it could be useful as a very fast zero-wire solution. Even then it'll be niche: most office workers or conference attendees don't need 100 Mbit, let alone more than 1 Gbit. But some will: maybe said office is full of video editors and such.

Remember that the bandwidth is a shared resource like WiFi: it isn't "up to 224 Gbit/s for every device in the room", it is "up to 224 Gbit/s for all devices in total". And that "up to" value is for ideal circumstances: there will be some environmental interference (though LOS limitations will work in this tech's favour there) and, more significantly, the more devices you have in a given collision domain the more they will interfere slightly with each other too, because they can't share the group resource with complete efficiency (and this difficulty grows exponentially after a certain threshold). Get a couple of machines transferring arbitrary data as fast as they can via WiFi, take the total of the transfer speeds they are getting, and now add more machines doing the same. The total might scale well for an iteration or two of this test, but there will be a point when it significantly won't and the total bandwidth will actually fall (and latency will shoot up).

The main push for 5G mobile networks wasn't better max bandwidth per device, but better performance (latency and throughput) for every device when you have a lot of them sharing the same collision domain. The headline figure of 224 Gbit is (while not at all dishonest) just that: a figure to gain interest via headlines. Those who actually have a need or want for such tech will be looking much deeper into it than that.
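
A crude illustration of why the aggregate doesn't scale with the number of stations: the classic slotted-ALOHA contention model below is far simpler than any real Wi-Fi/Li-Fi MAC (real networks with fixed contention parameters degrade more sharply), but it shows the shape of the problem against the 224 Gbit/s headline figure.

```python
link_rate_gbps = 224.0

def slotted_aloha_efficiency(n_stations):
    # Each station transmits in a slot with probability 1/n; a slot carries
    # useful data only when exactly one station transmits.
    p = 1.0 / n_stations
    return n_stations * p * (1 - p) ** (n_stations - 1)

for n in (1, 2, 10, 50, 200):
    eff = slotted_aloha_efficiency(n)
    total = link_rate_gbps * eff
    print(f"{n:>3} stations: efficiency ~{eff:.2f}, "
          f"total ~{total:.0f} Gbit/s, ~{total / n:.2f} Gbit/s each")
```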


Home networks in high-density condo towers, where WiFi channels are often already occupied by neighbors on the same and adjacent floors. The 5 GHz band might have a "limited distance" but it's still enough to interfere between neighbors.


Good point. I was thinking about the bandwidth within the local network partition and neglected to consider that the line-of-sight nature is helpful in compact housing because it significantly limits what can interfere with that local partition.


high fidelity AR/VR without cabling


It's "Data transmission tests have reached speeds of up to 224 Gbps" which might translate to something like 50 Gbps, or 6 GBytes/s of reliable goodput. Divide by the number of users. Wireless high res VR/AR gear might indeed be one application but also many normal "access my data over the network" applications on the laptop currently need need this class of bandwidth. Note that on normal ethernet you get the 40-100 Gbit/s BW per end user, since it's wired & switched. But this will be shared media.


Synchronizing LLMs, streaming AR textures, downloading Linux ISOs. If you give someone bandwidth, they will fill it.


The use case can be found here: https://lifi.co/lifi-applications/

I can only think of it being useful for conferences, where there's light everywhere.


10x 22.4GB/s download from CDN or 1x 112GB/s Li-Fi to Li-Fi local transfer before inevitable massive derating


Niche: video editing, interacting with AI/ML training clusters, ...?


iPhones that still have USB 2.0 ports /s


Wireless hdmi, 4k webcams would be nice.


You can do 4K with way less than a hundred megabits; some PoE cameras don't even have a gigabit port anymore since they don't need it.


Only with compression. Uncompressed 4k60 is 12.54 Gbit/s, so high data rates are definitely useful for wireless HDMI.
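
The arithmetic behind that figure (the 12.54 Gbit/s number includes HDMI blanking intervals; counting only active pixels gives a slightly lower rate):

```python
width, height, fps, bits_per_pixel = 3840, 2160, 60, 24   # 8-bit RGB, no subsampling
active_gbps = width * height * fps * bits_per_pixel / 1e9
print(f"uncompressed 4K60, active pixels only: {active_gbps:.2f} Gbit/s")   # ~11.94
```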


4K can mean a lot of things but some of the low bitrate iterations look fairly average.


Yeah but OP mentioned “webcams”


> speeds as fast as 224 GB/s

How do you build devices capable of producing or consuming data at that rate? I looked up the data transfer rates of RAM[0], and this is twice as fast as the fastest species of dual-channel DDR5.

[0] https://www.softwareok.eu/?seite=faq-This-and-That-or-Other&...


From rough memory, network switches often have total switch bandwidth figures measured in terabits per second.

So "where several other devices intersect" seems like reasonable first thought for where large speeds are needed.


Modern GPUs routinely exceed 5 Tbps in memory bandwidth. (GDDR6X at 20 Gbps per pin, times 256-bit bus width is 5120 Gbps.)

The original article unfortunately quotes incorrect units: 224 Gbps (gigabits per second) is NOT the same as 224 GBps (gigabytes per second).
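
The arithmetic behind that comparison, plus the unit distinction (a quick sanity check, not figures from the article):

```python
gbps_per_pin, bus_width_bits = 20, 256
memory_gbps = gbps_per_pin * bus_width_bits
print(f"GDDR6X example: {memory_gbps} Gbit/s = {memory_gbps // 8} GB/s")  # 5120 Gbit/s = 640 GB/s

link_gbps = 224
print(f"{link_gbps} Gbit/s = {link_gbps / 8} GB/s, not 224 GB/s")
```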


Even if your devices can't produce or consume data nearly that fast, it's still good to have all that bandwidth at the link level because you'll probably be sharing it with multiple devices.


Doesn’t that leave the same problem - the thing linked with multiple devices can’t take that data rate?

Or are you saying that A > B might use half that data rate while X > Y uses the other half?


Maybe this would be in some sort of specialised network appliance, bridging the link into a wired network of some kind, not a general-purpose machine buffering the network data in regular RAM?


I think the applications would have to be in data centers, where environments are controlled and certain hardware, such as GPU memory, can easily eclipse 224 GB/s.

But I'd assume that if you're in a datacenter, you can use physical wires since you control everything.


Doesn't this require a "clear line of sight"? Wi-Fi works across walls, around corners and so on. Is this possible with Li-Fi?


> Doesn't this require a "clear line of sight"?

It's a very nice feature. Imagine not having to share bandwidth with your neighbors.

> Wi-Fi works across walls, around corners and so on. Is this possible with Li-Fi?

Think of this as a wireless ethernet cable. Instead of having a jack in the wall, you have a light fixture in the room.


> Imagine not having to share bandwidth with your neighbors.

6 GHz Wi-Fi ("Wi-Fi 6E") is going to be a big help with this. Many more channels than 5 GHz, and the higher frequency means it doesn't travel as far. As someone who lives in a high-rise where 5 GHz is already pretty crowded, I'm looking forward to more devices supporting it.

I'll be honest, I don't see Li-Fi taking off for normal consumer use. Few users are going to be interested in arranging their home to have direct line of sight between their access point and game consoles, TV, etc. And there would need to be some very fast and reliable mechanism to hand over between Li-Fi and Wi-Fi, so that you don't cut out when you, for example, move your head slightly during a VoIP call.


The line of sight does not need to be direct; light can bounce and still retain enough brightness to communicate. Which is also how 5 GHz wifi works, because 5 GHz also has trouble going through objects like walls.


The 'light-based' network thing kinda threw me for a second... wifi IS technically light, it's just at a frequency (RF) we can't see and that can (somewhat) penetrate walls.


Light is only the visible part of the electromagnetic spectrum.


Not sure about direct line of sight, but the article mentions that it'll never fully replace wifi for the reason you've stated. Radio waves can travel much greater distances and penetrate walls/buildings etc. Light can't do that.


"In the Fraunhofer HHI video above you can see a Li-Fi system re-using a building’s lighting infrastructure for data. "

Oh cool, so if you want to hack a network, you just have to tap into their power line by climbing a pole or opening their service box. Kind of just like how people would climb telephone poles to make long distance calls.


Presumably it will be encrypted.


Sounds like the best use case for this tech is to control a whole host of robots in a building from a central processing unit. That means you could have very small robots receiving their commands from a new-wave mainframe orchestrating the entire activity of, say... a manufacturing facility.


Basic science question here, but aren't radio waves also a type of light, but just ones we can't see? What's the actual intended difference between radio and light in this context?


I wonder what effect this light will have on nonhuman animals, especially bees and mosquitos. If there were a specific frequency of light that repelled mosquitos, I'd want _that_ to be used by wifi


Will lifi work with a lot of reflective surfaces?

e: over post limit

I assumed it would need LOS. I'm wondering whether light bouncing off reflective surfaces like mirrors would meaningfully degrade the signal.


Are you asking if direct line of sight is needed? I’m wondering as well


> Light’s line-of-sight propagation enhances security

I am not convinced here, especially by the implication in that picture that data doesn't leave the room. Eavesdropping through windows is a thing.


An enhancement doesn't imply perfection. For instance, this would make wardriving far more difficult.


I think wardriving could stay; they'll just switch from antennas to very sensitive cameras ;)


In high-security environments windowless rooms are specified. Like a SCIF.


Guess we're not turning the lights off


Whatever happened to 802.11ad? (or even 802.11ay for that matter) That's also practically line-of-sight.


So, basically fiber optic technology but without either the fiber or the focusing lenses? Neat.


2.4 GHz receives noise from a microwave oven. I wonder if infrared receives noise from a gas stove.


> Uses IR

> can disrupt my wife's livestream by pointing the TV remote and it and jamming random buttons


The best application for this would be to implement high bandwidth NFC.


Why does any end user machine need more than a GB/sec?


It probably doesn't, except for a few cases like a video editing station not working with entirely local resources.

But when you have a lot of devices in the same collision domain (like WiFi, and the hubbed rather than switched networks of old) this is a shared network leg: get 50+ devices in the same room pushing data back & forth at various rates and I doubt you'd see anything like the theoretical total bandwidth of 224 Gbit/sec for that network leg (and latency will shoot up). Maybe you'll still reliably get 1 Gbit/device, which you wouldn't with other tech, but you won't see anything like 224/50+ Gbit per device. Try doing anything much on shared WiFi at a large conference, or on your phone's non-wifi data capabilities in a large public venue with hundreds of other phones even just idly interacting with the network, and you'll see what I mean.

The speed figure is just a headline figure, an attention grabber; not actually misleading but not at all as meaningful as the headline writers might think. The more important details are how well the tech works when congested: how much the connected devices can do, and how many of them, before the effective available bandwidth falls through the floor and/or latency figures reach for the moon. That 224 Gbit figure is the upper limit for the whole collision domain (the room, as this is an LoS-constrained technology), much like 54 Mbit was the upper limit on 802.11g and 11 Mbit was the upper limit on 802.11b (how often did you see those rates even in aggregate with more than a couple of devices active at a time?).


I’m curious to see what VR/AR applications this might have.

Imagine peripheral devices capable of much more than they are today merely because of more bandwidth.

Imagine how useful it would be to have a wireless NAS capable of acting as an iSCSI target without having to worry about running cables.

And as “spatial” computing use cases increase along with extremely high res formats like ProRes, we’ll quickly find ways to use the extra bandwidth.


The same reason end users want more than 640k of memory.


> 802.11bb

Chiral ghost Internet?


This is stupid....

Just because the average human can't see it doesn't mean it won't affect any humans or animals

Tons of animals see infrared, and this will have negative impacts on the environments it's used in as a result.

There are even some humans who can actually see partly into infrared/UV, and this is just evil to do to them.

I call this a moronic suggestion.


I believe we're talking about the equivalent of a small LED, like half your devices already have one or more of, but infrared and probably lower intensity. I.e. a teeny invisible glow. Even for animals that can see the glow, it would be low enough intensity to be barely noticeable.


How do many insects respond to even small sources of that particular section of the spectrum?

People don't even realize when their clothes have UV-reflecting materials which affect birds (e.g. hummingbirds), so of course y'all don't comprehend or accept what I'm saying; you literally don't know about this problem with human materials we have scattered all over, which are negatively affecting creatures' outcomes as we constantly trigger their basic natures.


If it is used as a light bulb, I am sure someone will be able to see it. Some light bulbs flicker, and it causes headaches.



