Discussion in 'hardware' started by JRViejo, Apr 28, 2012.
Computerworld Article by Tony Bradley.
I'll believe it when I see it. I note it took YEARS!!!! for the industry to agree on the 802.11n protocols, then it took YEARS MORE!!!! for them to come out of "802.11n (DRAFT)" and migrate entirely to 802.11n final.
Bill, we will see 802.11ac available sooner than you think. The Netgear 802.11ac draft unit is coming out, or it's already out. Now that everyone has settled into the N standard, they need something new to market to the public. In a few years, when ac becomes common, they will switch again. I'm still content with the G standard. As a matter of fact, I have two N routers running and I adjusted the settings on both to run in G mode. I also turned the output down to 75% power because I didn't want to blast myself with more 2.4 GHz radiation than I have to (I already have about ten devices, such as Bluetooth, phones, etc., blasting my house, and the lower setting is more than enough for coverage), and I don't want to blast my neighbors in a neighborhood already saturated with 2.4 GHz signals. For Gigabit speeds in my house I simply use wired Gigabit Ethernet.
When I said I'll believe it when I see it, I meant where it said, "...the official 802.11ac specification won't be finalized until sometime in 2013." If it is anything like the 11n standard, it will be several years beyond that.
Yeah, and that was the problem with "11n (Draft)". Then Linksys will come out with their "Draft" version. Then D-Link. Then Belkin. Then dozens more. And, like the 11n (Draft), there will be no guarantee Linksys (Draft) will work with Netgear (Draft), so many users will be forced, or encouraged, to buy all the same brand again for fear (and rightfully so) that mixed brands will not play well together. A "Standard" (not "Draft") exists specifically to ensure interoperability. Marketing weenies use this "Draft" phase to monopolize networks with their brands. IMO, that is not playing fair with the common (non-tech) user.
11n (Draft) took YEARS for the industry to finally decide on a final 11n "Standard" and that was ridiculous. And it was not the engineers that couldn't agree, it was the marketing weenies.
When demand requires new protocols, that happens. But it needs to happen like every other protocol, including the previous wireless versions: the "draft" versions (basically the "beta" versions) are there to work the bugs out, then the final version becomes the new Standard. You don't have 30 draft versions over several years.
No argument there. I am just saying, if ac takes as long to go final as 11n did, don't hold your breath for anything new (other than potentially non-compatible draft versions) any time soon.
Yes, in theory, all these devices can be flashed and upgraded to the 11ac "Standard" when it finally does go final. But history (and these forums) tells us theory and real-world often don't see eye-to-eye.
Also note faster wireless speeds do NOT equal faster Internet speeds; they only mean your local networking tasks will be faster. You will still be limited by the bandwidth (bottleneck) of your ISP service contract.
As seen here, as recently as yesterday, Internet speeds worldwide don't come close to 802.11g's 54 Mbps, let alone the faster speeds of 11n, the common 100 Mbps Ethernet, or Gigabit Ethernet.
So for larger wireless "local" business networks, 11ac may have an advantage. But for homes and small offices, I don't see anyone rushing out to replace their old 11n (or 11g, still in wide use) networks.
I hope I am wrong - but I fear it will be a long time off before 11ac is widespread - ESPECIALLY if it costs more.
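The bottleneck point above can be sketched with a trivial calculation. The link speeds below are illustrative assumptions, not figures from this thread:

```python
# Illustrative link speeds in Mbps -- assumed numbers, not measurements.
WIFI_AC_LINK = 1300     # a hypothetical 3-stream 802.11ac link rate
ETHERNET_LINK = 1000    # Gigabit Ethernet
ISP_PLAN = 50           # an assumed home broadband contract

def effective_internet_speed(local_link_mbps: float, isp_mbps: float) -> float:
    """Internet throughput is capped by the slowest hop in the path."""
    return min(local_link_mbps, isp_mbps)

# Upgrading the local link does nothing for Internet speed once it
# already exceeds the ISP's ceiling:
print(effective_internet_speed(WIFI_AC_LINK, ISP_PLAN))   # 50
print(effective_internet_speed(ETHERNET_LINK, ISP_PLAN))  # 50
```

Local file transfers between PCs, of course, do benefit from the faster link, since the ISP is not in that path.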
I will eventually purchase a new wireless router with 802.11ac but just like now I will change its settings to run g only.
You make some good points, but it feels futile. I can pick up 11 of my neighbors' wireless routers in my living room (house, not apartment), so I am getting blasted regardless of what I set mine to. I just leave mine set at N speeds since I have multiple PCs I transfer files between.
I would not get paranoid that RF radiation from your, or your neighbors', wireless networks is lighting up your insides. A "crowded" Wi-Fi neighborhood simply indicates potential interference issues, and it reiterates the essential need for additional security awareness when using wireless.
If you are worried about radiation, worry about your microwave oven or some other very high wattage transmitting device, especially if it is getting old and has been moved about several times. If you use your cell phone a lot (like hours a day, every day), use a wired headset; but I don't know where to tell you to put the phone.
Every electronic device is radioactive. Start covering the wall sockets with tinfoil if you're super worried.
Well, no, not exactly. Not radioactive. Radioactivity comes from "unstable" atoms emitting ionizing particles.
All, or almost all electronic devices emit RF "radiation" - but radiation is totally different from radioactivity - despite their similar names.
Yeah, that would be a bit extreme. And if nothing is plugged into the wall socket, there would be no current flow anyway. So I think we are safe there.
NO DOUBT overexposure to RF radiation is, and should be, a concern. It is a serious matter. So I jest, but only at the extremes. Don't worry about anything emanating from within your home, unless you do bounce your microwave oven on the floor once too often. I would be more worried about the effects of all the new RADAR systems for and on aircraft, weather radars, law enforcement radars in cars, on poles, and in those little mobile trailers. And coming out on all sorts of new cars for adaptive cruise control and automatic parking. Not to mention high wattage TV and commercial radio stations, CAT scans, dental X-rays, and security checkpoints. And cell phones pressed up close to our brains.
And just for the record (as a certified air traffic control radio technician for many years), tin would be way down the list of potential shielding materials, below silver, copper, gold, aluminum, brass, and bronze. RF screen rooms are typically made of copper mesh. Ours was set in place by a crane, and the door sealed like a vault door so nothing could leak out. Yes, we used ours to stop RF from leaking "out" when working on transmitters so they would not interfere with operational frequencies. So if it could stop watts of transmitted RF, shielding incoming microwatts is a piece of cake.
I note more and more new home buyers are opting for "safe rooms": hardened multi-purpose rooms used as shelters during tornadoes. The fancier ones are used as safe havens during robberies, and some as command and monitoring centers for home security systems. If RFI/EMI or radioactive fallout is a concern, that would be a good room to line with such copper mesh. But note the mesh MUST perfectly and totally seal the room, as if it needed to hold water without leaking a single drop.
You are getting blasted by all your neighbors' electronic transmission devices, by cell phone towers, television and radio stations, microwave ovens, computer RF leakage and more. Your neighbors' devices are transmitting from their homes, so by the time their signals reach you the power is significantly weakened, down to 50%, 25% or less, so I don't put as much importance on their wireless transmission power. But depending on the size of your home, the devices in your own home are blasting you with up to 100% power if you are next to them or in the same room, and 50% to 75% if they are located in another room.
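For what it's worth, the falloff with distance can be quantified with the standard free-space path loss formula. This is a textbook idealization (the distances below are assumptions for illustration; real walls and floors add further attenuation on top of it):

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) - 147.55.
    An idealized model; indoor obstacles only increase the loss."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

# A 2.4 GHz signal, same room vs. next door (distances are assumptions):
print(round(fspl_db(2, 2.4e9), 1))   # ~46 dB in the same room
print(round(fspl_db(20, 2.4e9), 1))  # ~66 dB from a neighbor's house

# Every 10x increase in distance adds 20 dB of loss (1/100th the power),
# which is why a neighbor's router delivers far less power than your own.
```

So the inverse-square falloff does most of the work here, even before any walls are counted.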
The dirty secret of the electronics, computer and cell phone industries, which really isn't hidden at all, is that the power coming from these devices has been scientifically and biologically shown to affect human cells and their structures. There is also a direct impact on the human brain and nervous system in ways that are difficult to measure. There is no question about that fact. The only questions are: in what ways does it impact human cells? What intensities and/or frequencies are required to have a negative effect? What length of exposure causes negative effects? What are the negative effects?
Cell phones, cordless phones and Wi-Fi devices are dangerous because you put the phones right next to your head, and they are transmitting up to one watt of power millimeters from your brain, which is extremely sensitive to electromagnetic radiation. Large standard microwave ovens use around 1000 watts to cook your food from room temperature to hot in seconds or minutes, and they operate in the same 2.4 GHz band used by Wi-Fi, Bluetooth and many cordless phones. What are the effects of one watt directly next to your head for minutes or hours every day, 365 days a year, for years or decades? What if you include the radiation from ALL the electronic transmission devices?
The fact is that the human nervous system, brain and body are being constantly bombarded with more electronic device generated radiation than ever before. I absolutely guarantee that some day the harmful impact of this will be brought to light. I am not "paranoid" about this. It's simply a scientific fact that it has some kind of impact on the human nervous system and the cells of the body. The only question is what kind of impact, and how much. Radio and television radiation has been common for almost a hundred years, but its intensity and proximity are minimal in comparison to what the new devices are transmitting. These new wireless transmission devices (cell phones, Wi-Fi, etc.) have only become extremely common in the last decade or two.
My suggestion to anyone with a thinking brain and a living body is to do everything you can to protect them. Use wired connections wherever you can and, yes, keep laptops using wireless out of your lap, the cell phone away from your bed, and the wireless power on your router turned down to only what you need for coverage in your home. While cell phone transmission power has been reduced in the last decade, modern cordless phones are a preferred option to cell phones, which have to connect to the closest cell tower, because the newer cordless phone models are designed to radiate much less transmission power (to save battery life) when they are close to their base in your home or office. Ideally, use corded phones when possible.
EMF radiation is everywhere there are electrical power lines, so worrying about it doesn't mean you can effectively escape being impacted by it. But just because it's everywhere in modern cities doesn't mean it doesn't have an impact that could be concerning. Have you ever seen those videos of high-power lines where people stand underneath them with a fluorescent tube and watch it glow? There is a reason regulations are now in place demanding that extremely high-power lines be kept a safe distance from homes and schools. They didn't use to have the same restrictions until evidence of potential health and safety hazards was verified. In twenty to fifty years, I can guarantee you the common person will have a different take on this subject, not too different from the way people have changed their opinions about cigarette smoking.
I believe Bill gets it.
Bill, it is also my understanding that you are an advocate of wired Ethernet as opposed to wireless.
I am, but for security reasons, not for any concerns over RF Radiation.
While wireless networks can be secured, it takes more effort and expertise to do so, and then to stay on top of it. And while that extra effort and expertise may be minimal, it is still required and therefore represents vulnerabilities not encountered with wired.
Dawn of a new wireless: first 802.11ac router available today
Security is just another advantage of wired Ethernet over wireless. As long as you run quality cable, speed and stability are also more reliable with Ethernet. I wasn't trying to imply that radiation from wireless was your reason for using wired Ethernet. I remember you stating before that you liked wired Gigabit Ethernet over wireless. The decreased radiation happens to be just an added plus, even if you don't consider the health concerns about excess radiation from wireless devices important. There is also the advantage that not using the 2.4 GHz or 5 GHz bands frees up the wireless spectrum for your other wireless devices and your neighbors' wireless devices.
Great! Now Buffalo gets bragging rights to say in their promotional ads that they were the first company to release an 802.11ac device to market. I can't wait to rush out right away and buy one.
This is a deal breaker. I have had problems with the 5 Ghz signals reaching the top right bedroom of my house from the router in the basement. Only 2.4 Ghz signals reach there. So, if they are entirely excluding the 2.4 Ghz band from the "ac" standard, then I will be forced to stick with the "n" standard.
I don't think that guy has his facts straight. If you look here,
I think you're getting confused. Whilst the standards are compatible, he's talking about the broadcast frequency. If you have a 2.4 GHz device, it's not magically going to pick up the 5 GHz-only ac router.
The way I am reading it is that 11ac will be able to support 11n/g devices. Of course, if the maker of the WAP (wireless access point; we are not really talking about routers here, though the WAP may be integrated with a router, and typically a 4-port Ethernet switch too) does not include legacy support, then you are stuck(?) with an ac-only network. But then again, mixing protocol versions has always had its share of problems too.
Unless the author would care to share his source, I will go by the link I provided.
The link you have provided does not explicitly state that "ac" will run on 2.4 GHz. Look at the band entry in the provided table: "n" is listed with both the 2.4 and 5 GHz bands, while "ac" is 5 GHz only.
I do not know what they mean with backward compatibility in this case.
Here is the official website saying that 802.11ac operates exclusively in the 5 GHz spectrum.
I guess vendors will have to install the 2.4 GHz band separately in their products and keep it at the 802.11n standard. This is what Buffalo did with their first "ac" based router.
I wonder how they managed to increase the coverage area using the 5 GHz frequency, as waves at this frequency do not go through walls as easily as waves in the 2.4 GHz range. I am guessing they had to use higher-power transmitters and higher-gain antennas, and I am guessing this will make the routers very expensive.
Your link goes to the same so-called "official site" I linked to, and where it clearly says it will support previous versions. FTR, that is not the official site. That is Broadcom's marketing site, and it is their "branding" calling it 5GWifi. The official term is 802.11ac.
So obviously, more information is needed but I note there is a difference between a protocol, and devices that support that protocol. A device can certainly support, and potentially integrate more than one protocol within that device.
This paper suggests it will be backwards compatible with other 5GHz devices - notably 11n devices. That's more in line with what you are saying, and I tend to agree with that. But that in no way suggests or implies that Netgear cannot make an 802.11ac device that also supports (via "dual-band") slower protocols.
So I think it is safe to say that 11ac is backwards compatible with 11n devices.
Yes, I agree we need more information. I guess whatever they do with the new standard, they will need to put a 2.4 GHz transmitter in the routers; otherwise, 802.11ac will not be able to reach a user behind multiple walls.
Naw, I don't buy that, not as a general blanket statement. 11ac is specifically designed to provide better coverage (fewer blind spots) over a greater range. Yes, lower frequencies do pass through some materials better than higher frequencies (due to longer wavelengths), but the difference between 2.4 GHz and 5 GHz, which are clearly in the same region of the RF spectrum, is not that dramatic. Therefore, those same walls are likely to affect either frequency in a similar fashion (with some notable exceptions).
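To put rough numbers on the 2.4 GHz vs. 5 GHz comparison (back-of-the-envelope physics only; these figures are illustrative, not from any product spec):

```python
import math

C = 3e8  # speed of light, m/s

def wavelength_m(freq_hz: float) -> float:
    """Wavelength = c / frequency."""
    return C / freq_hz

# Both bands are centimetre-scale waves; neither passes through walls
# the way long-wavelength broadcast radio does.
print(wavelength_m(2.4e9) * 100)            # 12.5 cm
print(round(wavelength_m(5.0e9) * 100, 1))  # 6.0 cm

# In free space, the higher frequency costs a fixed extra loss at any
# given distance, independent of that distance:
extra_loss_db = 20 * math.log10(5.0 / 2.4)
print(round(extra_loss_db, 1))  # ~6.4 dB: noticeable, but not an order of magnitude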
Also, I note frequency alone is not a determining factor for range.
And of course, the number of walls (and floors and ceilings) is a factor, as is the composition of those barriers, and the number and types of pipes and wires in those walls matter too.
And 5GHz is less susceptible to interference, and that alone can greatly impact range and performance as the 2.4GHz band is very crowded.
Finally, I am not sure range matters that much for most home users, as long as they can connect. Sadly, that is often determined by antenna placement. But in any case, it does not cost much to extend the range, if needed, and still keep the superior bandwidth, which is what it all boils down to.
They would need to put them in anyway for real backwards compatibility with b/g/n 2.4 GHz receivers.
On another note, why don't they scrap a and b support? Surely there is some overhead here that they could remove?