What is 802.11? Wi-Fi standards and speeds explained

In the world of wireless, the term Wi-Fi is synonymous with wireless access, even though Wi-Fi itself is a trademark of the Wi-Fi Alliance, a group dedicated to certifying interoperability between different wireless LAN products and technologies.

The standards themselves are part of the 802.11 family of specifications written by the IEEE, each with its own letter code after the initial 802.11, such as “802.11b” (pronounced “Eight-O-Two-Eleven-Bee”; ignore the “dot”) and “802.11ac”. This alphabet soup, which began in the late 1990s, continues to evolve, with improvements in throughput and range as we race to the future to get faster network access.

Along the way, improvements are being made by adopting new frequencies for wireless data delivery, as well as range improvements and reduced power consumption, to help support initiatives like “The Internet of Things” and virtual reality.

If it’s been some time since you’ve paid attention to all of the different letters of the 802.11 standards, here’s an update of where we’re situated with the physical (PHY) layer standards within 802.11, listed in reverse chronological order. At the bottom there are descriptions of standards still in the works.

802.11ah

Also known as Wi-Fi HaLow, 802.11ah defines operation of license-exempt networks in frequency bands below 1GHz (typically the 900 MHz band), excluding the TV White Space bands. In the U.S., this includes 902-928MHz, with varying frequencies in other countries. The purpose of 802.11ah is to create extended-range Wi-Fi networks that go beyond typical networks in the 2.4GHz and 5GHz space (remember, lower frequency means longer range), with data speeds up to 347Mbps. In addition, the standard aims for lower energy consumption, useful for Internet of Things devices that need to communicate across long ranges without using a lot of energy. It also could compete with Bluetooth technologies in the home due to its lower energy needs. The protocol was approved in September 2016 and published in May 2017.
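The range claim above follows from free-space path loss, which grows with frequency. Here is a quick back-of-the-envelope sketch in Python using the standard textbook FSPL formula (an illustration of the physics, not anything from the 802.11ah spec itself):

```python
import math

def fspl_db(distance_m: float, freq_mhz: float) -> float:
    """Free-space path loss in dB for a distance in meters and a frequency in MHz."""
    # FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44
    return 20 * math.log10(distance_m / 1000) + 20 * math.log10(freq_mhz) + 32.44

# Same 100 m link: the sub-1GHz band loses noticeably less signal
# than 2.4GHz or 5GHz, which is why lower frequency carries farther.
for f in (900, 2400, 5000):
    print(f"{f} MHz: {fspl_db(100, f):.1f} dB")
# 900 MHz: 71.5 dB, 2400 MHz: 80.0 dB, 5000 MHz: 86.4 dB
```

Every halving of frequency buys back about 6 dB, which translates directly into extra range at a given transmit power.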

802.11ad

Approved in December 2012, 802.11ad is very fast – it can provide up to 6.7Gbps of data rate across the 60 GHz frequency, but that comes at a cost of distance – you achieve this only if your client device is situated within 3.3 meters (only 11 feet) of the access point.

802.11ac

Your current home wireless router (if you like keeping up with advances in the space) is likely an 802.11ac router that operates in the 5GHz frequency space. With Multiple Input, Multiple Output (MIMO) – multiple antennas on sending and receiving devices to reduce error and boost speed – this standard supports data rates up to 3.46Gbps. Some router vendors include technologies that support the 2.4GHz frequency via 802.11n, providing support for older client devices that may have 802.11b/g/n radios, but also providing additional bandwidth for improved data rates.

802.11n

The first standard to specify MIMO, 802.11n was approved in October 2009 and allows for usage in two frequencies – 2.4GHz and 5GHz, with speeds up to 600Mbps. When you hear wireless LAN vendors use the term “dual-band”, it refers to being able to deliver data across these two frequencies.

802.11g

Approved in June 2003, 802.11g was the successor to 802.11b, able to achieve up to 54Mbps rates in the 2.4GHz band, matching 802.11a speed but within the lower frequency range.

802.11a

The first “letter” following the June 1997 approval of the 802.11 standard, this one provided for operation in the 5GHz frequency, with data rates up to 54Mbps. This came out later than 802.11b, causing some confusion in the marketplace, since 802.11b products couldn’t work with 802.11a products due to the different frequency band.

802.11b

Released in September 1999, 802.11b is most likely the standard your first home router used; it operates in the 2.4GHz frequency and provided up to 11Mbps of data rate. Interestingly, 802.11b products hit the market before 802.11a, which was approved at the same time but didn’t hit the market until later.

802.11-1997

The first standard, providing up to 2 Mbps of data rate in the 2.4GHz frequency. It provided a whopping 66 feet of coverage indoors (330 feet outdoors), so if you owned one of these routers, you probably only used it in a single room.

Pending Wi-Fi standards

802.11aj

Also known as China Millimeter Wave, this defines modifications to the 802.11ad physical layer and MAC layer to enable operation in the China 59-64GHz frequency band. The goal is to maintain backward compatibility with 802.11ad (60GHz) when it operates in that 59-64GHz range and to operate in the China 45GHz band, while maintaining the 802.11 user experience. Final approval was expected in November 2017.

802.11ak

There are some products in the home-entertainment and industrial-control spaces that have 802.11 wireless capability and 802.3 Ethernet function. The goal of this standard is to help 802.11 media provide internal connections as transit links within 802.1q bridged networks, especially in the areas of data rates, standardized security and quality-of-service improvements. Approval was expected in November 2017.

802.11ax

Known as High Efficiency WLAN, 802.11ax aims to improve the performance of WLAN deployments in dense scenarios, such as sports stadiums and airports, while still operating in the 2.4GHz and 5GHz spectrum. The group is targeting at least a 4X improvement in throughput compared to 802.11n and 802.11ac, through more efficient spectrum utilization. Approval is currently estimated to be in July 2019.

802.11ay

Also known as Next Generation 60GHz, the goal of this standard is to support a maximum throughput of at least 20Gbps within the 60GHz frequency (802.11ad currently achieves up to 7Gbps), as well as increase the range and reliability. The standard is expected to be approved between September and November 2019.

802.11az

Called Next Generation Positioning (NGP), a study group was formed in January 2015 to address the needs of a “Station to identify its absolute and relative position to another station or stations it’s either associated or unassociated with.” The goals of the group would be to define modifications to the MAC and PHY layers that enable “determination of absolute and relative position with better accuracy with respect to the Fine Timing Measurement (FTM) protocol executing on the same PHY-type, while reducing existing wireless medium use and power consumption, and is scalable to dense deployments.” The current estimate on approval of this standard is March 2021.

802.11ba

Otherwise known as “Wake-Up Radio” (WUR), this isn’t a crazy morning zoo crew thing, but rather a new technology aimed at extending the battery life of devices and sensors within an Internet of Things network. The goal of the WUR is to “greatly reduce the need for frequent recharging and replacement of batteries while still maintaining optimum device performance.” This is currently expected to be approved in July 2020.

Keith Shaw is a former Network World senior editor and writer of the Cool Tools column. He is now a freelance writer and editor based in Worcester, Mass.

Join the Network World communities on Facebook and LinkedIn to comment on topics that are top of mind.

IDG Contributor Network: Why your network needs the power of a platform

Evolving your network to have the characteristics of a platform is a core requirement of the Pervasive Network. The goal is to have a network that delivers greater value by reducing operational costs, while allowing for the rapid addition of new functionality and services to consumers, wholesale players or Industry X.0 – the emerging modern enterprise defined by cyber-physical production systems that combine communications, IT, data and physical elements. It is the enabler of many of the new technologies – 5G, VR and IoT – that are driving massive industry disruption and bringing new consumer services and industry solutions to market. To achieve this the network needs to adopt the characteristics of a successful software platform such as Google, Amazon or Facebook:

- Easy to collaborate with and plug in
- Flexible and agile (software-based)
- Scalable and secure with low administrative overhead
- Offering high value to attract ecosystem participants

These platform attributes can only be realized if the network evolves from a hardware-based architecture to a software-defined model with much of the functionality virtualized. The scale of this challenge varies greatly depending on the scope, complexity and age of the network and your business and operating model. Net new companies will have modern, software-defined networks with virtualized functionality; companies with IT systems that are 20, 30 or 40 years old have a more daunting challenge. The state of your network determines your starting point, but your end point needs to be determined by your business and operating model and your available capital and human assets. Keep in mind, there is no real end in this transition – your network functionality, like a platform, needs to be engineered so you can easily improve, manage and add features and functions as the market demands.

The opportunity is significant, and the downside could be obsolescence, so make sure you do your homework for a successful transition:

- Develop a plan that incorporates the needs of the business and customer base
- Undergo a comprehensive assessment of your current network infrastructure
- Determine a business and operating model that assures you can function profitably as you add new capabilities
- Do a skills assessment to determine the need for training and talent

Easy to plug in

Your network platform needs to be easy for others to plug into. You need to plan for today and the future, keeping in mind where API layers are headed and what the emerging technologies are for platform interaction and compatibility. Finding a standard model for your API layer reduces the barrier to entry for potential partners and participants.

Flexible and agile (software-based)

The edges of the network present the opportunity to integrate processes that reduce maintenance and improve scalability and reliability while lowering your operating costs. More importantly, the edge is where you will develop the services that make your offerings more competitive and improve customer satisfaction and, by offering new services, generate net new revenue streams. This is where the opportunity lies to differentiate your brand and offering, as well as to increase profitability.

New functionality and revenue

Today, many consumer services – entertainment, and home services like security and energy management – are delivered over the top of the network via the internet. The end points in the home are for the most part very limited in terms of customization. This is already beginning to change, with vendors offering consumers the ability to block IP addresses or put time limits on Wi-Fi access. As the network evolves into an agile, programmable software platform, these services will be integrated into the network and provide overall richer services with far greater control at the consumer end, allow for rapid improvement and innovation by the provider, and assure a two-way relationship with the customer, providing invaluable insights into usage and preferences.

Moving to a model that is software based presents the opportunity for your network to become more agile, scalable, secure and responsive to the customer needs, but adding agile programming and DevOps best practices are the skills needed to capitalize on this opportunity. These skills are not typically found in the resume of a legacy network system administrator, but rather a software programmer who has been working in a modern digital framework. You need to determine your needs and look at the options of training, hiring or acquiring, or all three, to assure you have the skills resident to maximize the benefit of a flexible, programmable network platform.

Scalable and secure

This is not limited to the consumer market. SD-WANs are emerging that will allow enterprises similar benefits: reducing operational overhead, adding functionality for the user and gaining insights into their needs. Considering the ever-growing data deluge and the race to harvest and manage data for competitive insights, the evolution of the network is as relevant in the enterprise as in the home. In addition, operators need to contend with the issue of data security and privacy. The functionality that can be programmed into the network layer could prove invaluable in this ongoing race.

Build an ecosystem

One provider cannot and will not do it all. The power of the ecosystems that evolve around platforms is indisputable. Developing a platform that allows easy integration of functionality and provides value to third-party ecosystem participants is a requirement. The communications industry is the best example of this. Partnering, innovating and acquiring are all strategies to remain competitive against the global digital platforms, and the network platform is the foundation for building this value for your customers and shareholders.

Get started and have a plan

The move to software defined and virtualized network functionality is inevitable, but it will be gradual. The technology, let alone the expertise, is lagging the potential. Large, legacy networks are highly complex and security concerns are paramount. It will be some time before we see networks that are wholly designed as software platforms. There is a saying in the tech industry that it over promises what will happen in three years, and surprises you with what will happen in five. I think this is true of the evolution of the network as a platform. Regardless of your starting point, it is safe to say, it is time to start getting soft around the edges.

This article is published as part of the IDG Contributor Network. Want to Join?

IDG Contributor Network: 5 reasons why IT can’t tame the user experience for the network manager

Every vendor today is spewing about the importance of managing the user experience. What this actually means, however, remains a mystery to most, and there are precious few approaches available to help you get a handle on the issue.

Good and predictable user experience is no longer negotiable in this age of constant online business communications. Computer networks have effectively become the single most important tool driving corporate productivity.

But user experience is one of the most difficult problems to address, especially on enterprise access networks, because each experience is influenced by a long list of moving parts, many of which are increasingly outside the control of IT.

For network professionals, there are five main reasons why today’s user experience sucks and why it has become so hard for IT staff to tame:

- Too much data to analyze and correlate across the stack.
- Too many disparate management and monitoring tools.
- More mobile users armed with smart devices accessing services not under IT’s control.
- No single source of network truth that can be shared among the different IT factions.
- Lack of a holistic, end-to-end view of the client experience.

When a user connects to an enterprise network, a series of transactions kicks off, each of which can impact the experience: connecting to Wi-Fi, authenticating to the network, obtaining an IP address, resolving a URL, getting routed to the appropriate domain, contending for access to an application on a server, the health of the app itself, and so on.
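That chain of transactions can be thought of as an ordered pipeline, where the first stage that fails is the one worth investigating. A minimal illustrative sketch (the stage names and result format are assumptions for illustration, not any vendor's actual model):

```python
# Hypothetical sketch: model the client-connection chain as ordered stages,
# so a failure pinpoints which transaction broke for a given client.
CONNECTION_STAGES = [
    "wifi_association",
    "authentication",      # e.g. 802.1X / RADIUS exchange
    "dhcp_lease",          # obtaining an IP address
    "dns_resolution",      # resolving a URL
    "routing",             # reaching the appropriate domain
    "application_response" # the health of the app itself
]

def diagnose(results: dict) -> str:
    """Return the first stage that failed, or 'ok' if every transaction succeeded."""
    for stage in CONNECTION_STAGES:
        if not results.get(stage, False):
            return stage
    return "ok"

# A client that associated and authenticated but never got an IP address:
print(diagnose({"wifi_association": True, "authentication": True}))  # → dhcp_lease
```

In practice each stage produces its own logs in its own format, which is exactly why correlating them is the hard part.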

To really understand how the network is behaving from the user’s point of view, volumes of data for each transaction must be correlated and analyzed. The problem is, IT network folks aren’t paid to be data scientists and can’t sit around staring at data 24 hours a day.

Just scouring logs from a single server to figure out why one user can’t authenticate to the network can require an inordinate amount of time, and forget it if the problem involves hundreds or thousands of users.

Oh, and mobility only makes things worse. Smart mobile devices outside IT’s control simply have too many different operating systems to keep track of, each of which behaves differently with different parts of the network.

To get by, network managers have been forced to use a mix of discrete vendor and homegrown tools, each of which provide some sort of view into specific parts of the network. But this can actually raise more questions than deliver answers when it comes to finding and fixing problems impacting clients. Is it the Wi-Fi network? Is it ARP? Is it DNS, DHCP, AAA or some application problem? Network managers shouldn’t have to use 10 different tools to figure it all out.

What’s more, the tools a WLAN engineer might use to troubleshoot individual or systemic client issues are most likely much different from the tools the applications or systems admin might use, none of which tell the entire story. Good user experience is predicated on successful transactions up and down the entire network stack.

So what’s next?

What network managers are really looking for is a heterogeneous platform that can be used to provide specific insights into the user experience across the wired and wireless network, the wide area network and application services. They need tools that automate the learning process of how the myriad user devices actually behave with every part of the network and deliver a single source of truth that examines and quantifies every aspect of the client experience. But that is easier said than done.

Suppliers of infrastructure components will tell you their solutions address user experience, but be cynical. They don’t. They might provide some raw data or pretty charts concerning one slivered part of the network, but that’s about it. Since the user experience depends on so many different client transactions, network professionals need to see it all and understand it all, with some level of network-wide context. Today that’s just not possible without going broke or crazy.

Instead, new software approaches are emerging that look promising. Far from perfect, these tools leverage recent advances in machine learning and big data analytics and marry them with cloud computing.

Wired and wireless traffic is typically siphoned off the network and delivered to a localized engine that crunches the data, looking for problematic trends and patterns impacting user connectivity.

While these user performance management tools vary from vendor to vendor, the good news is they aren’t tied to a specific infrastructure vendor. And, what’s more, they typically measure and analyze every client network transaction to learn how the network, services and applications are behaving with all client devices, providing a holistic view that exposes issues in the access network.

If there’s a Wi-Fi coverage problem in a specific location, a DNS connectivity issue for a given group of clients or an application response time problem, these solutions will see it and flag it for remediation.

This represents a huge win for IT because these types of problems typically account for a large percentage of the issues impacting user experience.

And while there is no silver bullet when it comes to improving user experience, the emergence of these new tools and technologies are providing a way to gain unparalleled visibility into exactly what’s going on with users on your network, wherever they are, whatever they’re doing.

Manage user performance, not the network, with machine learning-based tools

Over the past decade, network management tools have evolved from being fault based to performance based. This has become a critical element in running infrastructure because faults don’t matter as much.

Over the past decade, network management tools have evolved from being fault based to performance based. This has become a critical element in running infrastructure because faults don’t matter as much.

That might seem like a strange thing to say, but consider the fact that critical infrastructure such as switches, routers, Wi-Fi access points and servers is deployed in a way that protects against outages. Infrastructure is built so redundantly today that any hardware device can go down and it’s likely no one will notice.

A bigger problem is managing user performance. Often users call in about a certain application not working well, but when the engineer looks at the dashboard, everything is green. Performance problems are much harder to diagnose and can kill employee productivity.

Earlier this year, I did some research that looked at the impact of poor application performance and found that workers are 14 percent less productive because of application issues. Think about that statistic. Businesses invest billions of dollars every year in technologies to make workers more productive, but if they could just get all the applications they already run to work optimally, employee productivity would increase 10 percent or more.

Focus on the users, not the network

That is why IT professionals need to shift the focus of management from the network to the user. The problem with network management is that, by definition, it provides a bottom-up view of the world where user experience is inferred. User performance management (UPM) is more top-down: the underlying technology and dependencies are understood, so if a Wi-Fi AP gets overloaded, the network operations team can quickly understand which applications and users will be impacted.

It’s important to note that UPM can’t be achieved by taking a fault management tool and retrofitting or putting a new dashboard on it. Instead, UPM is achieved through a combination of data, machine learning and the cloud for scale.

Nyansa updates its Voyance UPM product, adds machine learning

One vendor trying to make UPM a reality is a startup called Nyansa. This week the company announced some updates to its Voyance UPM product that shift it from being a self-contained product to a machine learning platform that provides proactive remediation advice and can share data from third parties, extending Voyance’s value.

The new version now pulls in SYSLOG information from Cisco’s Identity Services Engine (ISE), HPE Aruba’s ClearPass and FreeRADIUS. This is an extra data source that provides visibility into the DHCP and DNS issues that indicate client connectivity problems. These tend to be the biggest source of problems on Wi-Fi networks, and Nyansa has about 70 percent of the market covered with just Aruba and Cisco.
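As a rough illustration of the idea, here is a minimal sketch of scanning syslog lines for the kinds of DHCP, DNS and authentication failure signatures described. The log format and patterns below are invented examples, not the actual output of ISE, ClearPass or FreeRADIUS:

```python
import re

# Hypothetical sketch: classify syslog lines by failure signature. The
# patterns and sample line are illustrative assumptions, not real vendor formats.
FAILURE_PATTERNS = {
    "dhcp_no_lease": re.compile(r"DHCPNAK|no free leases", re.IGNORECASE),
    "auth_reject": re.compile(r"Access-Reject", re.IGNORECASE),
    "dns_timeout": re.compile(r"DNS query timed out", re.IGNORECASE),
}

def classify(line: str):
    """Return the failure label matching this log line, or None if it looks healthy."""
    for label, pattern in FAILURE_PATTERNS.items():
        if pattern.search(line):
            return label
    return None

sample = "Oct 12 10:03:11 dhcpd: DHCPNAK on 10.0.4.17 to aa:bb:cc:dd:ee:ff"
print(classify(sample))  # → dhcp_no_lease
```

The value of a product in this space is doing this across millions of lines from heterogeneous sources and correlating the hits per client, which simple pattern matching alone cannot do.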

The SYSLOG integration will be important as Internet of Things (IoT) deployments become more common. Voyance’s machine learning algorithms can take the SYSLOG data and alert network operations when an IoT device performs at a level that is business impairing.

(Image: Nyansa Voyance dashboard)

Nyansa also added externally facing APIs to extend the platform. Data from Nyansa can now be exported to IT workflow products such as ServiceNow. Voyance could spot a performance issue and then proactively open a trouble ticket, enabling IT to get a handle on the problem before users report it.

Alternatively, Voyance could send data directly to a communication platform like Slack where data from specific endpoints, applications or users could be sent to a particular group of people. For example, problems reported with a customer-facing application could send information directly to a Slack Room that also includes the experts responsible for the application. This could significantly shorten the “resolution ping pong” that occurs as trouble tickets get passed back and forth before the problem actually gets solved.
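The kind of Slack integration described typically boils down to an HTTP POST to an incoming webhook. Here is a hedged sketch using only the Python standard library; the payload fields and webhook URL are illustrative assumptions, not Voyance's actual API:

```python
import json
from urllib import request

# Hypothetical sketch: push a performance alert into a Slack channel via an
# incoming webhook. Field names and the URL below are invented for illustration.
def build_alert(app: str, issue: str, affected_clients: int) -> dict:
    return {"text": f"[{app}] {issue}: {affected_clients} clients affected"}

def post_alert(webhook_url: str, alert: dict) -> None:
    req = request.Request(
        webhook_url,
        data=json.dumps(alert).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    request.urlopen(req)  # fire-and-forget; real code should handle errors and retries

alert = build_alert("crm-portal", "application response time degraded", 42)
# post_alert("https://hooks.slack.com/services/T000/B000/XXXX", alert)
print(alert["text"])
```

Routing the alert to a channel that already contains the application's owners is what shortens the "resolution ping pong" the article mentions.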

Another interesting feature added to Voyance is the ability to easily tag a device or person. Tagging enables engineers to constantly monitor the experience of critical assets or even people. The book Animal Farm taught us we are all equal, but some are more equal than others — and now IT can know when someone who is more equal, like a C-level executive or a high-performing salesperson, experiences problems. Tagging can also be used to monitor things such as factory floor equipment, heart pumps or other mission-critical IoT devices, and the resulting data can be searched, filtered or exported for additional analysis.

Nyansa helps IT become predictive rather than reactive

Nyansa uses machine learning to enable IT to move from a reactive model to a predictive one. Voyance now includes a remediation engine that provides IT with a view into where and how network incidents are happening and the impact on user experience. It also recommends specific actions to fix those problems — things IT may not even know are causing problems. For example, the remediation engine could suggest the 2.4 GHz radios on APs be turned off because they cause Wi-Fi problems.

One particularly valuable feature is that the tool now shows how many lost client hours will be recovered or lost by each action taken or not taken. This can help IT operations prioritize its efforts. Think of it as Splunk on steroids: Splunk shows lots of data, but the insights and actions must be determined by smart engineers who can correlate that data. The machine learning in Voyance does that heavy lifting, so IT can figure out what to fix faster and in an order that is most meaningful to the business.
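The prioritization logic amounts to ranking candidate fixes by the client hours they would recover. A minimal sketch, with invented actions and figures purely for illustration:

```python
# Hypothetical remediation actions with estimated client hours recovered.
actions = [
    {"action": "Disable 2.4GHz radios on floor-3 APs", "client_hours_recovered": 120.0},
    {"action": "Increase DHCP pool on VLAN 40", "client_hours_recovered": 310.5},
    {"action": "Update AP firmware in building B", "client_hours_recovered": 45.2},
]

# Highest-impact fixes first, so IT works the queue in business order.
ranked = sorted(actions, key=lambda a: a["client_hours_recovered"], reverse=True)
for a in ranked:
    print(f'{a["client_hours_recovered"]:>7.1f}h  {a["action"]}')
```

The hard part, of course, is producing the client-hour estimates in the first place; that is where the machine learning comes in.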

Managing user performance isn’t done by measuring one particular aspect of the network. It’s about understanding all the IT elements that make up a service and the relationships between them, and then having the insight to understand how they impact productivity. With the volume of data available today, this can’t be done manually. It’s important for IT pros to start using machine learning-based tools like Nyansa’s to solve those tough-to-fix performance problems that kill worker productivity.

Join the Network World communities on Facebook and LinkedIn to comment on topics that are top of mind.

Can Wave 2 handle the wireless tsunami heading toward us?

There seems to be a shift in our industry from wireless 802.11n to 802.11ac, as we have seen large leaps forward in bandwidth and client-saturation handling. With more wireless options in use in the workplace, widespread connectivity continues to rise and wireless requirements are becoming greater and greater.

Now, with Wave 2 becoming more common, is 802.11ac really able to handle the tsunami-like wave of wireless internet requests to meet this growing demand?

There’s only one way to find out. We need to step out of the comfort zone provided by past wireless technologies and expand the idea of what wireless is capable of providing to meet these demands.

Many people believe 802.11ax, the next standard in wireless LANs, can fulfill their wireless demands. A key characteristic of 802.11ax is orthogonal frequency-division multiple access (OFDMA), which takes the idea of MU-MIMO (talking to more than one client device at once) and explodes it out hundreds of times. But getting 6.8Gbps from your access point (AP) in the ceiling back to your switch in the closet is going to be difficult, so we’re going to have to bridge that gap in the meantime. This is where Wave 2 802.11ac may be the answer.

Wave 2 802.11ac AP products

Now that 802.11ac access points address these faster speeds and performance requirements, we are starting to see the rollout of Wave 2 802.11ac AP products.

HPE/Aruba’s 802.11ac Wave 2 access points offer extra features — multi-user MIMO and Bluetooth Low Energy (BLE) being some of the most notable — but they also allow users to take advantage of the new IEEE 802.3bz standard for multi-gig Ethernet.

HPE Smart Rate multi-gig Ethernet allows the possibility of high-speed connections — up to 10GbE over existing cabling infrastructure — as well as power over Ethernet for 802.11ac Wave 2 devices.

Pairing a Wave 2 access point with a switch port capable of multi-gig gives you the potential for increased speeds while also powering those devices. Smart Rate essentially auto-negotiates the connection (after the programming is complete on both ends, of course), attempting links at 10Gbps, 5Gbps, 2.5Gbps and 1Gbps in turn until it connects at the fastest speed possible.
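The fallback behavior described above can be modeled as walking the rate list from fastest to slowest and settling on the first rate both ends support. This is a conceptual sketch only, not HPE's actual negotiation logic:

```python
# Multi-gig rates in Gbps, tried from fastest to slowest.
RATES_GBPS = [10, 5, 2.5, 1]

def negotiate(switch_rates, ap_rates):
    """Return the fastest rate (Gbps) supported by both ends, or None."""
    for rate in RATES_GBPS:
        if rate in switch_rates and rate in ap_rates:
            return rate
    return None

# A Wave 2 AP with a 5Gbps uplink on a 10/5/2.5/1 multi-gig switch port:
print(negotiate({10, 5, 2.5, 1}, {5, 2.5, 1}))  # 5
```

The practical payoff is that an AP capable of more than 1Gbps of aggregate wireless throughput is no longer throttled by a gigabit uplink over the existing cabling.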

As more people and devices become connected and the way we interact with and control those devices changes, the load on our networks will only increase — almost to tsunami proportions. We need to break away from past wireless limitations and innovate. Wireless 802.11ac may be the first big step in the right direction. Only time will tell.

iPhone X review: Camera tricks, swiping up — and that OLED – CNET

Welcome to my ongoing impressions of the iPhone X.

I’ve entered my third full day with the phone. Day 2 was mostly a blur of media hits, after a nearly sleepless night. Everything you’re reading here is new as of the afternoon of Wednesday, November 1.

I’m a pretty quick adopter of weird, new tech. The iPhone X isn’t even that weird: it’s really an evolved iPhone, with a sharp design and some new ways to use it. But I found, sure enough, that my first day with Apple’s top-end phone was a learning process. Face ID and its log-in process. All the new swipes and gestures and button-presses. Learning to accept The Notch In The Screen.

I kinda like the flicking now

The swipe up move to replace the home button is growing on me. Weird but true. I realized, it’s a bit like flicking away apps after you’re done with them. It’s oddly satisfying. It’s more like, “I’m done with that” than “get me back home.” Even the animation has changed: it feels like you’re flicking the app away to infinity.

I still don’t like using it one-handed, but I’m enjoying the flow of it.

Reachability is back but hard to pull off

Getting to the top of the screen one-handed is possible with Reachability, a feature that drops the top half of the iPhone screen down for better thumb access (it can be toggled on in Accessibility in Settings). It used to work via a double-tap on the home button. Now it requires a swipe down on the bottom of the iPhone, off the edge of the screen. In practice, I find it impossible to do, even with two hands. It could help me reach the hidden-away Control Center, which now lives in the top right of the iPhone X screen, but I haven’t gotten the hang of its subtlety yet.


Most people like the design, are curious about the price

Day 2 was spent demoing the phone to people who hadn’t seen it before: on CBS This Morning, with Charlie Rose and Gayle King and Norah O’Donnell, and to Jeremy Piven in the green room. To Vlad Duthiers and Anne-Marie Green on CBSN. (Note that CNET is a division of CBS.) To Jon Fortt, Carl Quintanilla and Sara Eisen at CNBC.

Lots of people gave me their thoughts, and took selfie photos. It seemed like the design didn’t bother anyone, and most people liked it. The selfie camera Portrait Mode was a hit. Animoji charmed people. But the question I kept getting asked is, who’s paying $1,000 for it?

A thousand dollars is a magic number, and really, iPhones have already gotten nearly there with the iPhone 8 Plus. I think the kicker here is that this nice-looking X sits as an extra upgrade option above the 8 Plus, which is already an upgrade option to the 8. By its price alone, it’s not The Phone For Everybody.

Learning the new gestures confused others who tried it briefly. I was already used to how everything worked. Suddenly I became the expert. I realized I had the hang of it.

Great camera, when you learn the tricks

I went trick-or-treating with my kids, taking photos in the twilight with the iPhone X telephoto lens. Then I finally had a real night’s sleep.

But I enjoyed the telephoto camera’s new optical image stabilization. I still had moments of blur, but it seemed to me that, at last, 2X zoom photos looked as good as the wide-angle ones. Zoom feels more powerful as a result.

And I’ve gotten a kick out of the selfie shots. I learned that the Portrait Mode on the selfie camera tends to favor one person and keep them in focus, but if you stay close together, it’ll help keep two people in focus at once. Or, you can choose to not have Portrait Mode on at all. Also, selfie portrait mode doesn’t work with backgrounds that are too far in the distance: it needs something in middle distance, generally, in my experience. Otherwise it’ll just default to a regular selfie photo.

Yeah, I know flash is better, but I still never use flash.

OLED is so much better in the dark

I woke up to the glow of the iPhone X next to me. I picked it up in the darkness and suddenly realized, oh, yeah, OLED.

I’ve used plenty of OLED phones before (Samsung’s lineup in particular, including the Galaxy Note 8). I first thought the iPhone 8 Plus display, casually viewed, was sometimes similarly good-looking to the iPhone X, but in dimmer conditions the X definitely edges it out with a more vivid picture.

The keyboard feels like a lost opportunity

I hadn’t typed much on the iPhone X in my first hours with it. Then I started taking notes. I saw what others were commenting on: the keyboard has a lot of empty space underneath its virtual keys. What’s going on?

The X keyboard pops up in a part of the phone similar to where it is on the iPhone 8 and 8 Plus. It fits my thumbs. The keys feel narrower than the 8 Plus’, because the display isn’t as wide. It feels fine. But the large amount of display area underneath — easily enough for another set of keys or functions — seems odd. An emoji and dictation button are put down there, but why not anything else? It feels like space was left for a phantom home button that’s not there. But still, I typed quickly. It’s not bad, certainly, but it feels like the space could be used a lot more efficiently. Something for iOS 12, maybe — or could we see a change sooner?

Battery life isn’t fantastic

We’re doing full battery tests, but I found I needed recharging midday in my first few days. A morning commute on the train left me at about 70 percent by 10:30am after getting up at 7am. This is similar to what I get on the 8 and even the 8 Plus. (Yes, I’m spending a lot of time showing this phone off and running apps — but I’m also splitting time in my day between the X and an iPhone 8 Plus.) The point is, the X isn’t conquering new battery frontiers, at least in these early days.

Updated November 1, 4:29 p.m. PT. My earlier thoughts, first published Tuesday at 3 a.m. PT and updated thereafter, follow.

The iPhone X feels like a concept car, or a secret project. That’s because of the X name, probably, and the legacy of 10 years of iPhones. It’s also the fact that this is an optional step-up model — like an 8 Plus, but smaller. It’s a bold new design, different after three years of each iPhone looking very much the same.

I love new technology and the wild ideas that come with it. I love to be immersed in new concepts. But I’m also practical when it comes to tools. Will I use a fully rethought phone? Will it work for me when I need it to? My phone is my mission critical everything. It’s my Indiana Jones hat. Will Face ID work as well as the trusty Touch ID home button? Will I feel safe?

Ultimately the all important question is simple: Is this *the* must-have upgrade? Should my mom get it? Should my sister? My brother-in-law? My best friend? You?

I’ve spent a day now with the device to begin to answer this question. Consider this a living review that we’ll be updating throughout the week — and beyond — as we test, retest and experience the iPhone X.

Face ID works pretty well…

You’ve been able to unlock an iPhone with Touch ID using your fingerprint since 2013. The original iPhone shipped with a home button a decade ago. Apple’s making a big leap by getting rid of both in one fell swoop and replacing them with Face ID. Your face — or a passcode — is the only way to unlock the iPhone X.

Face ID worked well in early tests. Setup is quick: Two circular head twists and the iPhone adds your face to its secure internal database.

Unlocking isn’t automatic. Instead, the phone “readies for unlock” when it recognizes my face. So I look at the iPhone, and a lock icon at the top unlocks. But the iPhone still needs my finger-swipe to finish the unlock. It’s fast, but that extra step means it’s not instantaneous. Face ID recognized me most of the time, but every once in a while it didn’t.

I tried the phone with at least five of my coworkers. None of their faces unlocked it — although none of them look remotely like me. I also attempted to unlock it with a big color photo of my face on a 24-inch monitor, but that didn’t register as a face to the iPhone X either. The TrueDepth camera recognizes face contours to identify you.

Face ID worked perfectly in an almost completely dark room, too, lit only by the iPhone’s screen. (It uses infrared). We’ll still need to do a lot more testing to see what Face ID’s limits are. By default, it requires “attention” at the display, but that requirement for direct attention can be turned off for those who need it, or those who prefer to speed up the process.



…But it’s not perfect

By design, the iPhone X doesn’t unlock with just a glance. Once you’ve identified yourself with your face, you need to swipe up with your finger to get to your apps. Not only does the swipe remove the immediacy of Face ID, it means you need your hand to do anything. Quick access to the phone wasn’t quite as quick as I expected.

I pushed my face testing hard. I got a haircut, shaved my beard into several shapes, then off completely. I tried on sunglasses and other frames. I wore hats and scarves. Then I went to more absurd levels, including some that wouldn’t happen in most real-world scenarios, trying on wigs, fake mustaches and steampunk goggles.

The preliminary results are in my video. This is by no means a final test, but the bottom line is that most of the “real world” tests worked and showed me that Face ID is more resilient than I expected. Face ID didn’t mind my sunglasses. Scarves presented some challenges, but that makes sense if they’re pulled up over your mouth since they’re hiding essential aspects of your face. All the tests worked far better than Samsung’s face unlock feature on the Galaxy Note 8 — though Samsung kept its fingerprint reader on, as an easy backup.

The iPhone X occasionally asked me to re-enter the passcode after a failed Face ID attempt, then locked out further Face ID efforts until I entered the passcode again. If you’ve used Touch ID, this will remind you of trying to use an iPhone with wet fingers.

The big OLED screen is a welcome addition…

The 5.8-inch screen is the biggest on an iPhone to date, and the first Apple handset to use OLED (organic light-emitting diode) technology versus the LED/LCD in all previous iPhones. In addition to better energy efficiency, OLED screens offer much better contrast and true, inky blacks — not the grayish blacks of LCD screens.

At first use, the bigger screen feels great. I’ve wanted more screen real estate on the iPhone, and the X comes closest to all-screen. Picture quality improvement isn’t immediately noticeable over previous iPhones, but that’s a testament to how good Apple’s previous TrueTone displays are. The larger screen gives the iPhone a more current and immersive feel.

I’ll need more time to compare the screen to other iPhones — and to other OLED phones, such as Samsung Galaxy models.

…But the X’s screen feels different from an iPhone 8 Plus

That said, I grappled with a few X display quirks. Sure, there’s a notch cut out of the top of the screen where the front-facing camera array sits. But this isn’t just the Plus display crammed into the body of a 4.7-inch iPhone. The X’s display is taller than recent iPhones — or, when you put it in landscape mode, narrower. For some videos, that means they get letterboxed (black bars at the top and bottom) or pillarboxed (black bars on the left and right) to fit properly and the effective display area ends up a bit smaller than on the 8 Plus.

The rounded edges of the display mean that even if you expand a picture to fill the screen, parts of the image or movie end up cut off.

The notch didn’t bother me — much…

Hear me out. The notch and the two extra bits on either side end up feeling like bonus space: most apps don’t use that area, and it ends up relegated to carrier, Wi-Fi and battery notifications, which saves that info from cluttering the display below.

…But your favorite apps might not make the most of that screen

Many current apps aren’t yet optimized for the iPhone X. These outdated apps end up filling the same space as on an iPhone 8, leaving a lot of unused area. That’ll certainly get fixed for some apps over time, but it’s a reminder that the extra screen room here might not end up meeting your needs, until or unless the apps are optimized.

Living without the home button takes some adjustment

A number of new gestures take the place of the old home button. I kept reaching for the phantom button over the first few hours, feeling like I’d lost a thumb.

Unlike phones such as the Samsung Galaxy Note 8, which adds a virtual home button to create a “press for home” experience, the X remaps familiar gestures completely.

Swiping down from the corner now gives you Control Center, instead of swiping up. Swiping up is the new “home button.” Swiping up and holding brings up all open apps. And another new trick: swiping left or right on the opaque bar below all apps flips between apps for quick multitasking.

Meanwhile, there’s a new, large side button that brings up Siri and Apple Pay. I instinctively pressed and held it to shut down my phone, then I realized that is not what that button does. (To turn off the phone, you now hold that same side button *and* the lower volume button at the same time, which feels far from intuitive.)

Those gestures added up to some difficult maneuvers as I walked Manhattan streets in the Flatiron between my office and a local barber shop. At the end of the first day, I admit: sometimes I missed the simple home button.

You’ll need to adjust your Apple Pay routine

Double-clicking the side button brings up Apple Pay, but an additional face-glance is needed to authorize a payment. I tried it on our vending machine at the office and sometimes it worked great. Sometimes Face ID didn’t seem to recognize me. Maybe my timing was off.

I’m definitely going to need to check this out at more places in the days ahead. The bottom line: you don’t want to be the guy holding up the line at the drugstore because your double-click-to-Face-ID-to-NFC-reader flow was off.

The rear cameras are similar, not identical, to the iPhone 8 Plus

Like the iPhone 8 Plus, the iPhone X has a dual rear camera with both wide-angle and telephoto lenses. But X has two changes: A larger aperture (f/2.4 vs. f/2.8) on the telephoto lens, and optical image stabilization on both lenses (rather than just one on the 8 Plus), which should make for better-lit, less blurry zoomed-in shots at night or in lower lighting.

My colleague, CNET Senior Photographer James Martin, has done a deep dive on the new front-facing iPhone X camera, experimenting with portraits and shots around San Francisco.

The front camera is great with Portrait Mode…

In addition to handling Face ID duties, the TrueDepth front camera brings most of the magic of Apple’s rear cameras to the selfie world.

Portrait Mode, where the subject is in the foreground in focus with a blurred background, and Portrait Lighting, which applies various lighting effects to a photo after the fact, both now work on your selfies. Vanity, thy name is Portrait Mode.



…But not great with Portrait Lighting and my face

Portrait Lighting is officially in beta on both the iPhone’s rear and front cameras, and my experiences with it confirmed Apple isn’t finished perfecting the software that makes it work. My face ended up looking oddly cut-out and poorly lit. Unlike the rear cameras, which seemed to produce hit-or-miss Portrait Lighting shots, I haven’t had luck with my own selfies.

Get ready to be bombarded with animojis, and other TrueDepth AR and face-mapping apps

Animojis are exactly what they sound like: animated emojis. They’re cute. They’re also Apple’s showcase for the fancy TrueDepth camera, which maps your facial expressions onto monkeys, aliens, foxes and even a pile of poop. (If nothing else, the 10-second clips made my kids laugh when I sent them a few.)

Third-party apps also use the TrueDepth camera for real-time 3D effects. Snapchat created new face filters I got to play with, and some did an amazing job staying on my face. I’m curious to see how future apps use this tech for even more advanced face-aware AR.

Apple’s Instagram-like video app Clips has an update coming that also uses the camera to green-screen my face into different scenes, like an 8-bit gaming experience or a Star Wars filter where it looks like my face is a blue-tinged hologram. Again, it’s fun. For many people, the filters Snapchat already provides are probably enough.

Apple nailed the size and feel: Did it nail the entire experience?

I think the X is in the sweet spot that the older iPhone sizes could never perfectly be. It’s a good-feeling phone with a nice, large screen. The shift to Face ID and the removal of the home button feel like changes that some might be fine with, and others will find unnecessary. I’m still learning the X’s design language.

We’re just getting started!

Want to know more? So do we. This is the beginning of our iPhone X journey, not the final word. We’ve got plenty more on deck, including battery tests, benchmarks and in-depth comparisons to rival phones such as the Samsung Galaxy Note 8 and Google Pixel 2 XL.

We’ll continue to update our experiences throughout the week as we count down to the iPhone X global launch on Friday, Nov. 3.

For now, our CNET review of the iPhone X will be ongoing with a lot more tests.

Stay tuned and reach out to @jetscott with your questions on Twitter!
