> by preventing wall penetration, reducing jamming
This is the main selling point, I think, though not intentional jamming. Beyond the inherently lower latency, lower jitter, and higher throughput, none of these aspects are hampered by proximity to adjacent signals from other networks and other EM sources, unlike Wi-Fi.
Even if people don't want to kit out their entire house a la PoE, it would be a nice benefit to have this work side by side with Wi-Fi. Bad signal? Just walk into the room with the router, auto-switch to Li-Fi, and it's effectively as good as wired. Also, more devices automatically using Li-Fi when Wi-Fi isn't necessary will alleviate interference for everyone else where it actually is necessary. It's a win-win technology.
It's also interesting to see something move (relatively) quickly from experiment to standards proposal. I suppose that's due to the practicality of this tech.
In high-density living (condo towers, downtown, etc.), it is common to have interference from neighbors' Wi-Fi networks on the same floor or adjacent floors.
Having a network that expands using a non-interfering frequency would be a godsend, especially given the limited number of channels in the existing Wi-Fi standards.
Newb question: since they are both electromagnetic radiation, how does Li-Fi get up to 100x faster than Wi-Fi? Are the light transmitters and sensors that much faster? Or perhaps is it able to use a wider band of frequencies?
I'm also so curious how this ends up working in practice. Even using infrared, would it interfere with things like baby monitors in night mode? The Wikipedia article says it can be tuned to be less intense than humans can perceive, but I'm curious if that's true in practice. (Granted, babies don't typically need Internet access while they are sleeping, but maybe the monitor itself does.)
> the higher the frequency, the higher the theoretical maximum bandwidth.
This is not true and is unrelated to Shannon's theorem.
Shannon's theorem shows us that wider bandwidths allow for larger bit rates. At higher frequencies our bandwidths can be bigger. For example a band from 1 to 2 terahertz is 1 terahertz wide, which is 1000 times larger than a band from 1 to 2 gigahertz (1 gigahertz wide).
The total bandwidth available (including multiple channels) for 2.4 GHz Wi-Fi is about 100 MHz. The total space available for this new standard is 800 to 1000 nm [0], which is about 75 THz. That's roughly 750,000 times wider than Wi-Fi. That is why you get higher bit rates with this new standard, AKA more throughput, or more "bandwidth", when the term is used to mean data rate.
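As a sanity check, the Shannon-Hartley formula makes the comparison concrete. This sketch assumes an arbitrary 20 dB SNR for both bands, purely so the bandwidths can be compared on equal footing:

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon-Hartley limit: C = B * log2(1 + SNR)."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative comparison at an assumed 20 dB SNR for both bands:
# ~100 MHz of 2.4 GHz Wi-Fi spectrum vs the ~75 THz between 800 and 1000 nm.
wifi_limit = shannon_capacity_bps(100e6, 20)
lifi_limit = shannon_capacity_bps(75e12, 20)
print(f"Wi-Fi band:  {wifi_limit / 1e9:.2f} Gbit/s")
print(f"800-1000 nm: {lifi_limit / 1e12:.0f} Tbit/s")
print(f"ratio: {lifi_limit / wifi_limit:,.0f}x")
```

At equal SNR the ratio of the two limits is exactly the ratio of the bandwidths, which is why band width, rather than the carrier frequency itself, is what drives the headline data rates.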
> > the higher the frequency, the higher the theoretical maximum bandwidth.
> This is not true and is unrelated to Shannon's theorem.
You are correct that it is unrelated to Shannon, but it is still true. The higher your carrier frequency, the higher your theoretical maximum bandwidth (in the correct meaning, i.e. occupied spectrum). You can never have negative frequencies, so the maximum bandwidth you can modulate a 1 Hz carrier to is 2 Hz (modulation bandwidth extends to positive and negative frequencies). A 10 Hz carrier can be modulated to 20 Hz...
> Shannon's theorem shows us that wider bandwidths allow for larger bit rates. At higher frequencies our bandwidths can be bigger. For example a band from 1 to 2 terahertz is 1 terahertz wide, which is 1000 times larger than a band from 1 to 2 gigahertz (1 gigahertz wide).
So you are contradicting yourself? Not sure why you said the earlier statement is not true.
My point was that Shannon's theorem is defined in terms of bandwidth. I think speaking about frequency is misleading, even though it's true when discussing carrier/central frequencies. I shouldn't have said OP's statement was untrue, since it's strictly true as you say, that higher frequencies allow for wider bandwidth. They just don't have to have wider bandwidths, which I was trying to make clear. Thanks for the correction though!
I think this all boils down to the confusion that arises because people use bandwidth and capacity to really mean throughput. Talk about the capacity of your internet connection to a communication theorist if you want to start a rant.
True. The “band” is a range of frequencies, from lower bound to upper bound.
You can have a single frequency carrier that you modulate, in which case your bandwidth has more to do with your modulation scheme, and the rates implied by that
You can use FM, AM, QAM, or other multi-bit modulation schemes to send that information, but you need to have the signal-to-noise ratio to demodulate it. WiFi actually goes up to QAM-1024 (10 bits per symbol) in the more recent specs. However, the SNR you need to decode that perfectly is something like 35 dB, while recovering a signal that sends 1 bit at a time needs ~3 dB. A 35 dB SNR is very hard to reach unless the RF environment is quiet (basically impossible in an apartment building, for example), but 3 dB is easy.
Shannon's limit tells you the total information capacity of a channel given its bandwidth and SNR. This is usually approached by using deeper modulation than the SNR can cleanly support, and using error-correcting codes to recover the lost data.
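To put rough numbers on the 35 dB / ~3 dB figures above: for a spectral efficiency of k bits per symbol at one symbol per Hz, the Shannon bound implies a minimum SNR of 2^k − 1. A minimal sketch (the practical figures quoted sit several dB above these floors because real demodulators and coding aren't ideal):

```python
import math

def min_snr_db(bits_per_symbol: int) -> float:
    """Shannon floor on SNR for k bits/s/Hz: SNR >= 2^k - 1."""
    return 10 * math.log10(2 ** bits_per_symbol - 1)

# QAM-1024 carries 10 bits per symbol; BPSK carries 1.
print(f"QAM-1024 floor: {min_snr_db(10):.1f} dB")  # ~30.1 dB
print(f"BPSK floor:     {min_snr_db(1):.1f} dB")   # 0.0 dB
```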
A modulated signal can be expressed as a sum of signals with different frequencies, but will it be registered by the receiver as a signal at those frequencies?
Suppose we send a 1 Hz signal with length = 1 hour. In the middle we change the amplitude of one wave to 1/2. Does the receiver receive a mix of different frequencies?
The only signal that contains only 1 Hz and no other frequencies is a perfect 1 Hz sine wave. As soon as you start modulating the amplitudes away from that sine wave, you're introducing content at higher frequencies. You can use that higher frequency content to transmit information at more than 1 bit per second, but you're not exactly using the 1 Hz signal to transmit information.
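This is easy to verify numerically: generate a pure 1 Hz sine, halve its amplitude halfway through, and compare the spectra. A small NumPy sketch (sample rate and duration are arbitrary choices):

```python
import numpy as np

fs = 100                          # samples per second (arbitrary)
t = np.arange(0, 64, 1 / fs)      # 64 s of signal -> 1/64 Hz resolution
carrier = np.sin(2 * np.pi * 1.0 * t)   # pure 1 Hz sine

# Halve the amplitude in the second half: one "bit" of modulation.
modulated = carrier.copy()
modulated[len(t) // 2:] *= 0.5

freqs = np.fft.rfftfreq(len(t), 1 / fs)
spectrum_pure = np.abs(np.fft.rfft(carrier))
spectrum_mod = np.abs(np.fft.rfft(modulated))

# Energy in bins more than 0.1 Hz away from the carrier: essentially
# zero for the pure tone, clearly nonzero once the step is introduced.
away = np.abs(freqs - 1.0) > 0.1
print("off-carrier energy, pure tone:", spectrum_pure[away].sum())
print("off-carrier energy, modulated:", spectrum_mod[away].sum())
```

The amplitude step shows up as sidebands spread around 1 Hz, and that higher- and lower-frequency content is exactly what carries the information.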
SSB still makes other frequencies. Viewed on a waterfall, it's just half of AM, minus the carrier. Still occupies a range of frequencies, approx half of what AM does.
10 bits is at least plausible, but if you want to try to transmit 0.1 kbps with no frequency content over 1 Hz, you're going to need the mother of all SNRs.
Just imagine what we'll be able to do with Gamma data transmissions ;-)
I like to explain to my kids that it's all colors - we are transparent to some (such as X-rays) the same way some fish are mostly transparent to the light we can see and walls are transparent to the radios we use in Wi-Fi, and also that both snakes and bees can see light in colors we can't (snakes see IR, bees see UV).
I think you're misreading Shannon's Law. It's not that the higher the frequency the greater the max bandwidth. It's the higher the bandwidth the more data you can put through.
I think Li-Fi is more comparable to Ethernet instead of Wi-Fi, not because the fundamentals are very different but because of the link budgets available. If you think of an ethernet line as a pipe carrying data, then a collimated laser can be thought of in a similar way - most of the energy that you transmit is going to make it to the destination, and that is not the case with Wi-Fi, even with very high gain antennas. This allows for different modulation schemes and thus higher throughput. Copper ethernet is now capable of 1.6 Tbps [1] and Li-Fi doesn't seem so very fast compared to that; however keep in mind this is comparing only physical layers. Demonstrations of optical laser links of hundreds of Gbps over hundreds of kilometers have taken place [2] using COTS optical ethernet transceivers and special output stages providing precise collimation and pointing.
For your second question, I'm not sure how baby monitors work, but the proposed wavelengths are unlicensed and there are few if any rules for how to deal with interference. There are rules for eye safety of lasers, which limit the maximum energy that can be delivered at the output. Generally, as Li-Fi gets more common we will have to learn to deal with interference as it arrives. For example, Lidar systems (older, noncoherent ones) interfere with one another and are even susceptible to interference from IR motion detectors and such, but these aspects have to be considered during design.
My understanding is they went with 800 to 1000 nm (infrared), or ~ 375 to 300 THz. I'm not sure as to the total combined bandwidth of all of WiFi 6 or 7, but 75,000 GHz band gives them a lot to play with.
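For reference, converting the band edges with f = c/λ (a quick sketch):

```python
C = 299_792_458  # speed of light in m/s

def wavelength_to_thz(nm: float) -> float:
    """Frequency in THz for a vacuum wavelength given in nanometres."""
    return C / (nm * 1e-9) / 1e12

f_high = wavelength_to_thz(800)    # ~374.7 THz
f_low = wavelength_to_thz(1000)    # ~299.8 THz
print(f"800-1000 nm spans about {f_high - f_low:.1f} THz")
```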
> Newb question: since they are both electromagnetic radiation, how does Li-Fi get up to 100x faster than Wi-Fi?
They specified that they wanted to be 100x faster, and they added parallel channels, until they reached that goal. No, really, this is the real reason.
It is entirely nonsensical to ask for a physical reason, because different channels are just different.
Well, the physical reason is that the band available in the visible light spectrum allows you to add that many channels in parallel. You can't do that at 2.4 GHz.
> Even using infrared, would it interfere with things like baby monitors in night mode?
If you're referring to the infrared LEDs that illuminate the baby, their light is not polarized, while light used for communication is, so a polarizing filter in the receiver can filter out such noise.
Baby monitors with night vision usually have a small set of tiny infrared lights so that they can see. Adding additional infrared to the room will help the monitor to see better.
Depending on how fast the extra IR light is pulsing, it might end up looking like your baby is sleeping through a rave - which would be entertaining enough to be passed off as a feature rather than a bug.
"Between 3-30 hertz (flashes per second) are the common rates to trigger seizures but this varies from person to person. While some people are sensitive at frequencies up to 60 hertz, sensitivity under 3 hertz is not common"
This came immediately to mind; when comorbid with infrared sensitivity, it is likely to trigger people without any apparent cause to third-party observers...
People forget that the average human barely sees jack sht compared to the remarkable exceptions of our species, let alone that such exceptions often have disabling/uncommon conditions comorbid with their remarkable capabilities.
Correct. The reason why microwaves can cook food isn't the fact that it is a microwave frequency. Microwave ovens cook by flipping the polarity back and forth. The frequency emitted is the same resonant frequency as water molecules, so the water molecules attempt to align constantly to the ever-changing polarity. Movement is heat; thus, the water heats the food.
To set the record straight for the above comment -
- it's true that there isn't a precise frequency needed for microwave ovens to heat food
- however, "polarity flipping" is just a description of electromagnetic radiation itself, and shooting enough EM radiation at food in a frequency range it absorbs will heat it up via dielectric heating
- microwaves have no relationship to any specific resonant frequency of water - the vibration frequencies are orders of magnitude higher https://en.wikipedia.org/wiki/Electromagnetic_absorption_by_... while rotation response inherently does not have a peak
- otherwise, yes, the motion of (polar) molecules induced by an electric field is indeed the mechanism of dielectric heating
On one hand, I see where this could be helpful in certain scenarios like an office, where there is a consistent and planned layout specifically for the purpose of productivity. Or for military applications, where EMSEC is taken very seriously (to the point where Wi-Fi is generally not used at all in most classified facilities). Though I am also not convinced this would change the calculus much there in reality.
On the other hand, I don’t see the draws outweighing what seem to be clear setbacks. E.g. if I put my LiFi enabled phone in my pocket mid download, it will completely cease to work.
What is interesting is the idea of a much more comprehensively connected future. E.g. imagine a building that is both Wi-Fi and LiFi enabled, with automatic switching between the two based on which is less congested and provides the best speeds. As our daily bandwidth footprint grows, I can see the benefit in having multiple spectra for information transmission.
I’m not sure that this provides any benefit from an EMSEC perspective, as you’d basically be going from a position where you’re avoiding radios and even cables and similar things that aren’t TEMPEST shielded for fear of emissions leaking sensitive data, to a position where you’re broadcasting your sensitive data over the air, and a listening device simply needs to look at your lights somehow. I will agree that it’s easier to block light than radio, but I think that’s where the advantages end.
I think you have this backwards. You'd still encrypt your transmissions, same as if you were using WiFi, so it's a wash in terms of security. But it should be much harder to jam your receiver/transmitter, because it's point-to-point.
> E.g. if I put my LiFi enabled phone in my pocket mid download, it will completely cease to work.
The light part, sure, but the regular radio WiFi part will be fine; it'll be slower, but it won't go away. Ideally there's seamless transition between the networks: LiFi when you have your phone out and within a sender's line of sight, WiFi in other cases.
Your mobile phone needs to move between cellular towers quite frequently when you are moving around. You don't notice this handoff because making it seamless is important to the functioning of the network.
If we decide that moving from Li-Fi to Wi-Fi is important, we can make it seamless.
Right, but I think it's not pretty on the carrier's side. Maybe some more abstraction over where exactly the data is flowing is needed, similar to what WireGuard does.
Right, this is similar to hopping across cellular antennas, someone tries to keep it seamless for you while hopping within their controlled environment, but we still have challenges when your public IP address drastically changes (Wifi->Cellular on phones).
> Of course, Li-Fi isn’t going to sweep away Wi-Fi and 5G alternatives (nor wired networks). Radio waves still have a distinct advantage with regard to transmission through the atmosphere at great distance, and through opaque objects. Instead, work must concentrate on using horses for courses – with Li-Fi advantages being harvested where possible.
> if I put my LiFi enabled phone in my pocket mid download, it will completely cease to work.
As per your EMSEC use-case, this is also a privacy benefit, as your pocket becomes a de facto Faraday cage guaranteeing that your devices can only transmit information when you want them to.
In your pocket, or just turn around so your body is in between the source and your device? Maybe reflections get the job done, but that has to harm data rates, right? I am not an optical expert.
I've researched LiFi before, but every time I research it I find expensive commercial equipment, or hobbyists playing with very low data rates on Arduino.
I have many questions, especially about how a LiFi receiver works. Wouldn't this essentially need to be a high-speed camera with very few pixels?
Does anyone have recommendations of a dev kit, or transceivers to play with this? Also, ones that don't cost several thousand dollars?
The technology is mostly in those two camps because it's new, expensive, and niche, like WiFi was 20 years ago. As production increases (starts?), components will drop in cost and become standardized and more ubiquitous. At the moment there is a lot of practical research you can read about from conferences like OFC [1] and SPIE PW in the free-space laser or telecom tracks [2]. At the moment transmitters and receivers are made up of components highly specified for their use case and are very parametrized; for example, there are hundreds of different DFB lasers that can be used as sources.
I'm a little surprised that IEEE has already standardized, especially given the wavelength they chose, but I imagine their members were forced to adopt a prolific technology, as without a standard they risk the technology moving ahead without them.
Basically, the only new principle involved is that instead of de-serialized, subcarrier-modulated data being modulated onto a 2.4 GHz sine wave and emitted from an antenna as electric-field changes, it is now sent as changes in light level on a subcarrier frequency.
Or more simply, maybe it could be done in YouTuber style with a light-emitting diode on the Tx antenna port and a photosensitive diode on the Rx antenna port? The switching speed of the Tx-side LED could become the limiting factor in that case.
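That LED/photodiode idea is essentially on-off keying. A toy sketch of the principle (bit period, noise level, and threshold are made-up numbers, not anything from the standard):

```python
import numpy as np

rng = np.random.default_rng(0)

def ook_transmit(bits, samples_per_bit=8, noise=0.05):
    """On-off keying: LED on for a 1, off for a 0, plus receiver noise."""
    light = np.repeat(np.asarray(bits, dtype=float), samples_per_bit)
    return light + rng.normal(0, noise, light.size)

def ook_receive(signal, samples_per_bit=8, threshold=0.5):
    """Photodiode side: average each bit period, compare to a threshold."""
    levels = signal.reshape(-1, samples_per_bit).mean(axis=1)
    return (levels > threshold).astype(int).tolist()

bits = [1, 0, 1, 1, 0, 0, 1, 0]
recovered = ook_receive(ook_transmit(bits))
print(recovered)  # [1, 0, 1, 1, 0, 0, 1, 0]
```

Averaging over the bit period is what gives the receiver its noise margin; in real systems the LED's switching speed bounds how short the bit period can get.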
> wouldn't this essentially need to be a high speed camera with very few pixels?
A camera sensor 'pixel' is just a device which conducts proportional to the amount of photons that hit it. A typical digital color camera uses CMOS chips to do this, with a filter on top of them to isolate red, green, and blue. It is pretty basic; the real trick is getting millions of them on a 1/4" sensor and having them relay the data properly with a reasonable amount of noise.
We have other devices that are "just a device which conducts proportional to the amount of photons that hit it" but we do not call all of them "cameras".
A camera is a device which receives light signals and translates those into an image of some format.
A photovoltaic cell in a solar panel is a device which generates electricity proportional to the number of photons that hit it.
A photoresistor is a device which resists current in proportion to the number of photons which hit it.
See now, such a LiFi transceiver would not necessarily be termed a "camera" any more than the infrared sensor in urinals is a camera. I believe that's the way we want it to be, right? LiFi has no use for producing images, only translating light back into network and signaling data. That's not called a "camera" by any means.
> We have other devices that are "just a device which conducts proportional to the amount of photons that hit it" but we do not call all of them "cameras".
But if you took those devices and made an array of them you can make a camera sensor. Ergo, if you take one element of a camera sensor you have a light detector element.
You want to call it a "camera sensor" but I personally would not attach the moniker "camera" unless they are in the business of translating light into images of some kind. Here, let's ask Wikipedia:
"A camera is an optical instrument used to capture and store images or videos, either digitally via an electronic image sensor, or chemically via a light-sensitive material such as photographic film."
See now, a camera is the whole instrument, not merely its image sensor. But a camera uses an "image sensor". What is an image sensor?
"An image sensor or imager is a sensor that detects and conveys information used to form an image. It does so by converting the variable attenuation of light waves (as they pass through or reflect off objects) into signals, small bursts of current that convey the information."
So there is no way we've described what's going on in LiFi. For example, you walk up to a urinal and the infrared sensor detects you. Does it paint an image of your privates on a website? No. There is no camera in the urinal, hopefully. The urinal is only interested in whether you are standing right there or if you've left. The urinal does not employ an "image sensor", it uses something dumber.
Likewise, do your solar panels use cameras? They conduct based on exposure to light, don't they? But what is a solar panel concerned about? It generates electricity, not images. A photovoltaic cell is not an image sensor because it has nothing to do with images.
What would your camera be if I disabled the viewfinder display and eliminated its ability to save files on sdcard? Would it still be a camera if its sensors produced electricity but it couldn't provide me an image based on that conduction?
LiFi is not using something dumber, but LiFi is likewise unconcerned about creating images. Since a camera is, by definition, concerned with images, LiFi does not use cameras.
I think you would benefit from this advice as well. I did attempt to apply logic, but you're ignoring the quoted Wikipedia definitions, and essentially we're just talking past each other; I have no idea what sort of terminology you're trying to throw around now, because it doesn't evidently have anything to do with LiFi tooling as it is.
I answered a simple question: 'is this essentially a camera with few pixels', and I defined what a camera pixel is and agreed that it is 'essentially a camera with few pixels'. It obviously isn't a camera, just like a bicycle is not a motorcycle, but it is 'essentially a motorcycle with a person as an engine'.
If you still think that is wrong, then that's fine I guess; it's your opinion and you are welcome to it.
I appreciate the agreeable answer, but it's also worth noting that the bicycle came first in its simplicity, and so the correct framing is that a motorcycle is a bicycle with an internal combustion human.
So how did cameras start? Well, the word is literally Latin for "room" because a man would go into a small, darkened room with only a pinhole opening at one end, and he could observe an image projected on the far wall.
So the original "camera obscura" had no lens or sensors at all! It was essentially an aperture and a screen. The observer could then paint or draw according to the projected image he perceived with the image sensors in his eyes.
Also to a more trivial extent, perhaps the same idea behind the IR ports on Gameboy Colors. Which is a really interesting and probably underutilized feature to think back on now. And even more interesting in that this preceded the first mass consumer devices offering WiFi, which Wikipedia tells me only really took off with Apple’s iBooks in 1999.
Lego Mindstorms bricks could be programmed over an IR connection like this. The earliest ones shipped with an IR transmitter that had a 9-pin serial socket on the back.
IR was great, you could do things like transfer contacts and data between palmtops.
I mean I got a palmtop (a Palm V iirc) for cheap well after they were commonly used and I never used it for anything important, but still, it was a cool device. I think I have it somewhere still, wonder if it still works. I mainly used it to play Sudoku on though.
> My first thought on reading that headline: isn't that single/multimode fiber?
Kind of how radio being a wireless telegram system:
> You see, wire telegraph is a kind of a very, very long cat. You pull his tail in New York and his head is meowing in Los Angeles. Do you understand this? And radio operates exactly the same way: you send signals here, they receive them there. The only difference is that there is no cat.
This was tried with WiGig, and that didn’t solve any problems significant enough that it caught on. People tried to use WiGig for things like wireless docking stations, and I suspect it just wasn’t needed because at such short ranges, most people would want/need to have a power cable connected (which naturally leads to the USBC docking stations we have today, where both power and data go over the same cable).
WiGig wasn't fast enough to be useful for wireless HDMI, especially given that 4K displays arrived right at that moment, and 2nd-gen WiGig was probably way too expensive.
The question is whether it's possible to create a full-rate 224 Gbps transceiver that is cheap enough to appear in mid-range phones, notebooks, and TVs, and whether it would work without line of sight, but in the same room, with reflected light.
The wireless adapter for the HTC Vive headset was based on WiGig. The range is limited to a few meters, but that's more than enough for room-scale VR. The need for massive bandwidth and low latency make it a perfect use case for WiGig.
WiGig just came too early; the underlying tech was barely able to support it. It was the Windows 7 and Vista era: expensive, and the software really wasn't there either.
It might be different now with VR and the rise of docks.
> Of note is that the first commercially available Li-Fi system has been available since 2014, and it hasn't gained any traction?
Probably for the normal early reasons, too big, fiddly, and pricey. Someone must have finally shrunk and/or cheapened it enough for a large segment of enthusiast or business for it to gain more attention again.
Reading that, it seems to be about consuming/receiving data rather than sending it. Or do you have light shining out of your laptop back at it?
There is an example of a school using it, and the students have a USB device in their laptop to read the light, but is it slow upload, or are they also connected to Wi-Fi?
Back in the day, I had a laptop with an infrared serial port. Considering WiFi cards weren't a thing at the time, it was pretty handy to "Air Drop" files to other students.
"LiFi" does seem pretty niche. Anywhere you're okay with a 10-15m range, have full line-of-sight, somehow can't run a wire or fiber run, and don't want regular WiFi for whatever reason.
Could be useful for modular autonomous robotic devices that act as a single organism, like ants or cells. They would each do their own thing but communicate using this (or something else fast, short-range, and error-resilient) so as to work towards a unified goal without a lot of redundancy.
Any place you have this in a point-to-point rig, you could have fiber cables guiding the light and creating a private collision domain, improving overall bandwidth, reliability and security.
The cost of datacenters is not currently constrained by fiber.
Fiber is not free to run. In an open floor plan office with 20 workstations on an overgrown table, you could run 20 fiber pairs (or cat5e cables or cat6a cables), and you could terminate them and connect them, or you could set up one LiFi access point and 20 client devices. The latter is a lot less cable bundle and a lot less labor.
This can be done now with wifi. The question I answered was about datacenters.
There is wifi in datacenters -- usually for the benefit of visiting techs. Not for inside a rack.
Incidentally, if asked to set up 20 workstations on an overgrown table, and it's not a very temporary thing while the office is in turmoil, I recommend finding a different employer.
For your home network, there are likely few or no use cases to warrant a new tech like this.
But for offices, schools, conference halls, and other place with many people together it could be useful as a very fast zero-wire solution. Even then it'll be niche: most office workers or conference attendees don't need 100Mbit let alone more than 1Gbit. But some will: maybe said office is full of video editors and such.
Remember that the bandwidth is a shared resource like WiFi: that isn't "up to 224 Gbit/s for every device in the room", it is "up to 224 Gbit/s for every device in total". And that "up to" value is for ideal circumstances: there will be some environmental interference (though LoS limitations will work in this tech's favour there) and, more significantly, the more devices you have in a given collision domain the more they will interfere slightly with each other too, because they can't share the group resource with complete efficiency (and this difficulty grows exponentially after a certain threshold). Get a couple of machines transferring arbitrary data as fast as they can via WiFi, take the total of the transfer speeds they are getting, and now add more machines doing the same. The total might scale well for an iteration or two of this test, but there will be a point when it significantly won't and the total bandwidth will actually fall (and latency will shoot up).
The main push for 5G mobile networks wasn't better max bandwidth per device, but better performance (latency and throughput) for every device when you have a lot of them sharing the same collision domain. The headline figure of 224 Gbit is (while not at all dishonest) just that: a figure to gain interest via headlines. Those who actually have a need or want for such tech will be looking much deeper into it than that.
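The collapse-under-load behaviour described here is the classic shared-medium result. Real Wi-Fi uses CSMA/CA rather than ALOHA, but the much simpler slotted-ALOHA model (useful throughput S = G·e^(−G) for offered load G) already shows throughput peaking and then falling as load grows; this is only an illustrative stand-in, not a model of this standard:

```python
import math

def slotted_aloha_throughput(offered_load: float) -> float:
    """Fraction of slots carrying a successful frame: S = G * e^-G."""
    return offered_load * math.exp(-offered_load)

# Throughput rises with load up to G = 1, then collapses as
# collisions come to dominate the slots.
for g in (0.2, 0.5, 1.0, 2.0, 4.0):
    print(f"offered load {g:.1f} -> useful throughput {slotted_aloha_throughput(g):.3f}")
```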
Home networks in high-density condo towers, where WiFi network channels are often already occupied by neighbors on the same and adjacent floors. The 5 GHz band might have a "limited distance" but it's still enough to interfere between neighbors.
Good point. I was thinking about the bandwidth within the local network partition and neglected to consider that the line-of-sight nature is helpful in compact housing because it significantly limits what can interfere with that local partition.
It's "Data transmission tests have reached speeds of up to 224 Gbps", which might translate to something like 50 Gbps, or 6 GBytes/s, of reliable goodput. Divide by the number of users. Wireless high-res VR/AR gear might indeed be one application, but also many normal "access my data over the network" applications on the laptop currently need this class of bandwidth. Note that on normal Ethernet you get 40-100 Gbit/s of BW per end user, since it's wired & switched. But this will be shared media.
How do you build devices capable of producing or consuming data at that rate? I looked up the data transfer rates of RAM[0], and this is twice as fast as the fastest species of dual-channel DDR5.
Even if your devices can't produce or consume data nearly that fast, it's still good to have all that bandwidth at the link level because you'll probably be sharing it with multiple devices.
Maybe this would be in some sort of specialised network appliance, bridging the link into a wired network of some kind, not a general-purpose machine buffering the network data in regular RAM?
I think the applications would have to be in data centers, where environments are controlled and certain hardware, such as GPU memory, can easily eclipse 224 Gbps.
But I'd assume that if you're in a datacenter, you can use physical wires since you control everything.
> Imagine not having to share bandwidth with your neighbors.
6 GHz Wi-Fi ("Wi-Fi 6E") is going to be a big help with this. Many more channels than 5 GHz, and the higher frequency means it doesn't travel as far. As someone who lives in a high-rise where 5 GHz is already pretty crowded, I'm looking forward to more devices supporting it.
I'll be honest, I don't see Li-Fi taking off for normal consumer use. Few users are going to be interested in arranging their home to have direct line of sight between their access point and game consoles, TV, etc. And there would need to be some very fast and reliable mechanism to hand over between Li-Fi and Wi-Fi, so that you don't cut out when you, for example, move your head slightly during a VoIP call.
The line of sight does not need to be direct; light can bounce and still retain enough brightness to communicate. This is also how 5 GHz WiFi often works, because 5 GHz WiFi also has trouble going through objects like walls.
The 'light-based' network thing kinda threw me for a second... WiFi IS technically light; it's just at a frequency (RF) we can't see, and one that can (somewhat) penetrate walls.
Not sure about direct line of sight but the article mentions that it'll never fully replace wifi for the reason you've stated. Radio waves can go much further distances and penetrate walls/buildings etc. Light can't do that.
"In the Fraunhofer HHI video above you can see a Li-Fi system re-using a building’s lighting infrastructure for data. "
Oh cool, so if you want to hack a network, you just have to tap into their power line by climbing a pole or opening their service box. Kind of just like how people would climb telephone poles to make long distance calls.
Sounds like the best use case for this tech is to control a whole host of robots in a building from a central processing unit. That means you could have very small robots receiving their commands from a new-wave mainframe orchestrating the entire activity of, say, a manufacturing facility.
Basic science question here, but aren't radio waves also a type of light, but just ones we can't see? What's the actual intended difference between radio and light in this context?
I wonder what effect this light will have on nonhuman animals, especially bees and mosquitos. If there was a specific frequency of light that repelled mosquitos, I want _that_ to be used by wifi
It probably doesn't, except for a few cases like a video editing station not working with entirely local resources.
But when you have a lot of devices in the same collision domain (like WiFi, and the hubbed rather than switched networks of old) this is a shared network leg: get 50+ devices in the same room pushing data back & forth at various rates and I doubt you'd see anything like the theoretical total bandwidth of 224 Gbit/s for that network leg (and latency will shoot up). Maybe you'll still reliably get 1 Gbit/device, which you wouldn't with other tech, but you won't see anything like 224/50+ Gbit per device. Try doing anything much on shared WiFi at a large conference, or on your phone's non-WiFi data capabilities in a large public venue with hundreds of other phones even just idly interacting with the network, and you'll see what I mean.
The speed figure is just a headline figure, an attention grabber: not actually misleading, but not at all as meaningful as the headline writers might think. The more important details are how well the tech works when congested: how much how many devices can do before the effective available bandwidth falls through the floor and/or latency figures reach for the moon. That 224 Gbit figure is the upper limit for the whole collision domain (the room, as this is an LoS-constrained technology), much like 54 Mbit was the upper limit on 802.11g and 11 Mbit was the upper limit on 802.11b (how often did you see those rates, even in aggregate, with more than a couple of devices active at a time?).
I believe we're talking about the equivalent of a small LED, like half your devices already have one or more of, but infrared and probably lower intensity. I.e. a teeny invisible glow. Even for animals that can see the glow, it would be low enough intensity to be barely noticeable.
How do many insects respond to even small sources of that particular section of the spectrum?
People don't even realize when their clothes have UV-reflecting materials which affect birds (e.g. hummingbirds), so of course y'all don't comprehend nor accept what I'm saying; you literally don't know about this problem with human materials we have scattered all over, which are negatively affecting creatures' outcomes as we constantly trigger their basic natures.