It’s been 50 years since a hand-written, back-of-the-napkin drawing led to Ethernet. And it’s been 40 years since the introduction of the first Ethernet standard, brought to you by Intel, Xerox and Digital Equipment Corp. (DEC) and supported by a cast of thousands of engineers.
Since then, the evolution of Ethernet technology has enabled the kind of communications infrastructure early sci-fi writers could only dream about.
Yet today, many of us take the lightning-fast responsiveness to internet searches and apps for granted. If we have to wait 5 seconds for a web page to load, we move on to other things, shaking our fists at the technology gods and cursing our virtual solitary confinement.
But high-speed communications are not a birthright. In fact, they’re a hard-won achievement. Along the way there have been many new inventions to enable Ethernet’s multiple speed gains.
You can view Ethernet’s history in 3 eras, says Gary Gumanow, Intel sales enablement manager for Ethernet, who’s witnessed all three. And now we could be entering a fourth. Let’s take a look.
Era 1 (1980 - 2005): Faster yet also smaller
As the 1980s dawned, engineers from Intel, Xerox and DEC came together to author and standardize an IEEE specification for a communications protocol. What they couldn’t know then was that Ethernet would empower the tech world for decades to come.
Their early work was based on ideals of interoperability, openness and the need for both higher speeds and less delay (aka low latency).
While fast data transfer was just one of Ethernet’s many stated goals, it’s mainly how the technology’s first era has been defined. Early engineers worked tirelessly to achieve transfer rates up to and beyond 2.9 megabits per second (Mbps).
They achieved that goal in 1982. That’s the year Intel shipped the world’s first high-volume 10 Mbps Ethernet controller.
Early Ethernet products from Intel
This herculean effort would kick off many years of R&D. Intel provided the world’s first dual-speed 10/100 Mbps network adapter in 1994. Then the first single-chip 10/100 Mbps Ethernet controller in 1997. And then the first single-chip 10/100/1000 Mbps controller in 2001.
Intel’s 82557 10/100 Ethernet Adapter
Why was a single-chip network controller so revolutionary? Mainly because it ushered in LAN on Motherboard (LOM). This made Ethernet the ubiquitous connection for desktops, servers and embedded devices. Ethernet for everyone!
Era 2 (2005 - 2015): Server virtualization
The path to Ethernet’s second era, virtualization, was forged in 1999 with the standardization of Gigabit Ethernet (aka IEEE 802.3ab). The first compatible products came to market just 2 years later, in 2001.
In a way, Gigabit Ethernet was a solution looking for a problem. That’s because servers running a single application could barely push a gigabit per second of data down the wire. Gigabit Ethernet provided more bandwidth than data centers needed.
Ah, but then came virtualization. This meant multiple applications could be consolidated on a single physical server, each application running with its own virtual operating environment. As Intel’s Gumanow said back then, “Every virtual machine needs a Gigabit Ethernet connection.”
Each VM was given its own dedicated, physical 1 Gigabit Ethernet (GbE) connection. This finally enabled data-center architects to begin using more of their available server resources. Now they could consolidate physical servers, increasing efficiency and creating greater IT scale.
Several years later, we got another 10X improvement with 10Gb Ethernet. Once again, the industry had delivered more bandwidth than we knew what to do with.
But it was also perfect timing for network and server architects. 10GbE solved the problem of sprawling network-adapter ports, cables and switch ports in servers and data-center racks.
But how to virtualize all those network adapters in the server? The solution came from Intel. Through the PCI-SIG, the company introduced Single-Root Input/Output Virtualization (SR-IOV).
Most virtualization hypervisors allowed for the emulation of a network adapter in software; however, this introduced latency. By contrast, SR-IOV allowed a single physical adapter to be split into multiple “virtual functions” that run on the adapter itself, instead of in software on the host processor cores. This also delivered additional improvements in throughput and server efficiency.
Improvements in network virtualization also helped data-center managers. One example is Intel’s Virtual Machine Device Queue (VMDQ). This technology aligns transmit and receive queues in the network adapter’s hardware with specific VMs on the host.
These advances in both hardware and software have enabled virtualization to be the scale engine of the cloud. Today, VMs can host many customer workloads scaled across a single server platform in the cloud.
Networking still needed a method for keeping data and flows secure and separate from one another. From Ethernet’s point of view, the cloud is simply a vast array of remote data centers that collect, process and disseminate data for a wide variety of applications and users. What’s remarkable is the quantity of that data: nearly 100 zettabytes so far.
Data-center virtualization, enabled by Ethernet technology, also gave us access to fast, reliable email and vast databases. It still powers many modern conveniences used today.
Today’s data centers get more done with virtualization
Another benefit of virtualization: It can reduce a data center’s carbon footprint. By enabling multiple VMs on a single physical server, virtualization has helped IT managers use electric power more efficiently.
The resulting decrease in power consumption for processing, cooling and management may not be the silver bullet we need to reverse climate change. But at least it’s a step in the right direction.
Era 3 (2015 - Present): Optimization
Improving how a technology works may not be as sexy as inventing it in the first place. But it’s no less valuable.
Today, Intel and others are considering how best to upgrade Ethernet by improving application responsiveness and reducing latency.
Bandwidth continues to improve, first with 40GbE, then 25GbE and 100GbE. It should soon move to 200GbE and beyond.
“A chain is only as strong as its weakest link,” Gumanow reminds us. “As apps scale, their responsiveness becomes less predictable. Application Device Queues provides the mechanisms to scale the pipeline between the network, compute and storage.”
Optimization is the process of reclaiming that predictability. It’s a big challenge given the titanic scale on which Ethernet-based systems now operate.
Yet the resulting systems could catalyze advances in mission-critical applications, such as those used by healthcare and aerospace, where failure simply isn’t an option.
Era 4: Will there be one?
While the future is uncertain, it’s safe to say Ethernet tech will be with us for some time to come.
Sure, each day brings the possibility of technological advances that offer faster speeds and higher reliability. In fact, many have already come and gone. But it would still take many years to replace the infrastructure that enables Ethernet to push data to even the farthest corners of the planet.
Ethernet is the unseen thread in the fabric of our digital lifestyles. It’s as vital to us now as it’s ever been. Ethernet will still be vital when we wake up tomorrow morning and log on to our devices to check the weather, start a Zoom meeting and navigate to a new destination.
So will the Ethernet have a fourth era? That may be the wrong question. Maybe the right one is: How do we know we’re not already in it?
Edge computing is like a modern car engine: It’s something many people use every day, but hardly anyone understands.
One good definition comes from Alex Reznik of the ETSI standards organization. He says, “Anything that's not a traditional data center could be the 'edge' to somebody."
Is that the perfect explanation of edge computing? Probably not. But it does illustrate the edge’s ubiquity — and just how difficult agreeing on a definition can be.
The concept of the edge may seem amorphous. But it’s not impossible to pin down.
Where’s the edge?
Simply put, the edge is wherever data gets created. As for edge computing, it’s any computing that’s done at or near the data’s source.
The edge: computing done at or near the data’s source
An edge device could be as simple as your PC or smartphone. Your smart speakers or Internet of Things (IoT) devices. Point-of-sale (POS) terminals, autonomous cars, and the networking devices in your office, too. Basically, any device that’s not located in a central data center can be considered out on the edge.
The edge can also refer to devices that are one or more steps removed from the end user.
For instance, server nodes designed for gaming (aka gamelets) are edge computers that process, store and deliver data to gaming devices. These gamelets can offer the lower latency required by modern games better than the cloud because they’re physically closer to their end users.
What the edge is not
Edge computing is not Amazon Web Services (AWS), Microsoft Azure, or any of the myriad public or private data centers operated by Google, Apple, Dropbox, Facebook and the like.
These data centers are often referred to as the core or the cloud. You can think of them as central repositories for data coming from the edge.
These data centers also serve as the central processing systems that analyze and act on that data.
However, a device operating within these mega data centers — with the exception of the servers — could also be considered an edge-computing device.
Whether a given device is involved in edge computing has everything to do with both how we interface with it — and how it interfaces with the core.
A 2018 report by Gartner offers a stark contrast between core and edge computing. Four years ago, the research firm estimated that only 10% of the world’s data was created outside traditional centralized data centers or clouds.
But a more recent report, this one by IDC, shows the focus quickly shifting from core to edge. IDC predicts worldwide spending on edge computing this year will rise 15% over 2021, reaching a total of $176 billion.
Looking ahead to 2025, IDC predicts that spending on edge computing that year will top $274 billion. “Edge computing continues to gain momentum as digital-first organizations seek to innovate outside the data center,” says IDC researcher Dave McCarthy.
All that money will feed more than 150 edge use cases, he adds. These will include content delivery, virtual networks, smart grids, freight monitoring, smart transportation, even AR-assisted surgery.
The personal is professional
How is an increase like that possible in a relatively short amount of time? The answer lies in the proliferation of professional and consumer electronics.
Are you reading this on a tablet? Is there a cellphone in your pocket and a smart speaker nearby? Does your refrigerator push as much data to the cloud as the Chromecast plugged into your smart TV?
If so, then you’re one of the billions of users creating data at the edge every day.
Computing on the edge: billions served
The more data we create, the more we must rely on edge computing to process and act on that data locally — instead of waiting for it to make a round trip to the cloud and back.
Bandwidth matters, too
Edge computing can also save bandwidth.
Consider, for example, a smart security system that includes 10 cameras. Streaming the data collected by these 10 cameras to the cloud could be a bandwidth nightmare.
However, with an edge-computing component such as a local server node, the footage could first be parsed. Then the server node could decide which data is worth keeping.
The bandwidth required to keep an eye on all those cameras would be just a fraction of what it might have been otherwise.
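To put rough numbers on that, here’s a back-of-the-envelope sketch in Python. The camera bitrate and the share of footage worth keeping are illustrative assumptions on my part, not figures from any real deployment:

```python
# Back-of-the-envelope comparison: streaming every camera feed to the
# cloud vs. filtering at a local edge node first.
# All figures here are illustrative assumptions, not real measurements.

CAMERAS = 10          # the 10-camera system described above
STREAM_MBPS = 4.0     # assumed bitrate of one 1080p camera feed
KEEP_RATIO = 0.05     # assume the edge node keeps ~5% of the footage


def cloud_bandwidth_mbps(cameras: int, stream_mbps: float) -> float:
    """Uplink needed if every camera streams raw footage to the cloud."""
    return cameras * stream_mbps


def edge_bandwidth_mbps(cameras: int, stream_mbps: float, keep: float) -> float:
    """Uplink needed if an edge node forwards only footage worth keeping."""
    return cameras * stream_mbps * keep


cloud = cloud_bandwidth_mbps(CAMERAS, STREAM_MBPS)              # 40.0 Mbps
edge = edge_bandwidth_mbps(CAMERAS, STREAM_MBPS, KEEP_RATIO)    # 2.0 Mbps
print(f"cloud: {cloud} Mbps, edge: {edge} Mbps, "
      f"saved: {100 * (1 - edge / cloud):.0f}%")
```

Under these assumptions, the edge node trims a 40 Mbps uplink down to 2 Mbps, a 95% reduction.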
Your opportunity in edge
Your customers may not know it, but they already rely on the edge to buy, sell, communicate and grow their business every day.
As edge computing evolves, so too does the opportunity for tech providers. The edge is a new market, one that requires complete solutions including end-user and edge computing devices.
Potential vertical markets include healthcare, commerce, manufacturing and gaming. They’re among those whose appetite for edge computing should grow exponentially over the next few years.
Are you ready to grow with them? If not, maybe it’s time to step out onto the edge.
You probably know that Chromebooks are the favorite PCs for K-12 schools, but did you know that Google’s Chrome OS is all grown up and ready for business too?
For proof, you need only look as far as Chrome Enterprise. It’s a business-centric collection of software and devices designed for SMBs and enterprises.
The enterprise version of Chrome OS isn’t all that different from the standard user and education versions, but there are some important add-ons. They include Active Directory integration, advanced security protection, access to enterprise device policies and kiosk mode.
Making the case for Chrome
Why would your customers want RC Cola when they could have Coke or Pepsi? After all, most of the business world runs on Windows.
The answer has much to do with Chrome’s unique operating paradigm. Unlike a traditional OS, Chrome is more of a cloud interface. It connects to cloud-based apps, and it saves data not to the PC’s internal hard drive, but instead to a storage drive somewhere in the cloud.
IT managers may prefer this method because it adds another degree of security. Keeping apps and data in the cloud instead of on a local drive helps to prevent ransomware attacks and obviates lengthy security updates.
Sure, cloud providers can be hit by ransomware too, but their security measures are more robust than most. Cloud providers are also careful to keep applications current and fully patched.
Chrome’s method also makes it easy to switch and share hardware. With all apps and data in the cloud, you simply grab the nearest Chromebook, log in, and get to work.
Chromebooks for the C-suite
Are your customers ready to do business with Google? If so, they have lots of choices. Most major OEMs — including Lenovo, Dell, Acer and ASUS — offer at least one Chromebook for business.
Or maybe your customer is looking for something more, the crème de la crème of Chromebooks. If premium gear is on your customer’s menu, a new HP laptop could be just what they’re looking for.
The HP Elite Dragonfly is about to graduate to generation 3 — the expected release is sometime before the end of June. With this version, the Elite Dragonfly, which normally runs on Windows 11, will offer an option for Chrome OS.
HP Elite Dragonfly G3: With optional Chrome OS
New hardware features are expected to include a haptic trackpad, fingerprint sensor and privacy screen. Under the hood, you’ll be able to configure the new Dragonfly with a 12th gen Intel Core processor, up to 512GB of SSD storage, and up to 32GB of RAM.
This will also be the first Dragonfly Chromebook to feature Intel’s vPro platform, which adds additional layers of enterprise-level security and connectivity.
Is there a catch? Yup, there’s always a catch. In this case, it’s the price tag.
HP hasn’t revealed the magic number yet, but all signs point to a price similar to that of the Gen 2 Elite Dragonfly. That second-gen PC starts at around $1,800 and, when configured with all the trimmings, rockets to over $3,300.
Chromebooks for the masses
If your customers like the idea of Chromebooks for business, but would prefer to keep hardware costs on the shady side of $1,000, you might want to introduce them to the new HP Pro c640 G2 Chromebook Enterprise.
With a retail price starting at around $850, the c640 is being advertised as the world’s thinnest 14-inch Chromebook. It’s just 0.65 inch (16.5 mm) thick.
And the Pro c640 G2’s relatively low weight — a little over 3.2 pounds — makes it ideal for workers on the go in this new post-pandemic, hybrid-work world.
HP Pro c640 G2: World’s thinnest 14-in. Chromebook
Top-level specs include an 11th gen Intel Core i5 processor, 128GB of SSD storage, and 8GB of DDR4 memory. Unfortunately, the latter is soldered to the motherboard, which means it can’t be upgraded or replaced.
The c640 has a decent array of ports, including two USB-C ports and an HDMI connection. And the 14-inch display offers FHD visuals with a maximum resolution of 1920 x 1080.
But don’t expect to look like a supermodel on a Zoom chat. The webcam supports a max resolution of only 720p, a downright anachronistic spec in a world where so many of us are “zooming” to work.
RC Cola ain’t so bad
It’s good to have choices. Sure, Windows still enjoys a market share of over 90%. And, yes, Mac is still the go-to creative platform. But both Microsoft and Apple are creating products based on the same software and hardware models in use since the dawn of the computer age.
Google, on the other hand, is working hard to offer the world a smart, new way to work, learn and play.
Is Chrome Enterprise right for your customer’s business? The best way to find out is to consider all the options, choose the right hardware and give Google’s cloud-based platform a try. Who knows, your customers just might like doing business in the cloud.
Welcome to the 50th year of Ethernet. That’s right, we’ve been sending packets over cables between devices for half a century now.
Along the way, Ethernet has enabled a technological revolution. Its reliable, ever-increasing high-speed communication has spurred growth in myriad sectors, including virtualization, high-performance computing (HPC), data mining and artificial intelligence (AI).
“Ethernet is a fabric of our lives and reaches everybody on a minute-by-minute basis,” says Gary “Gigabit” Gumanow, sales enablement manager for Ethernet at Intel.
He adds: “While it’s coming up on 50 years since the first Ethernet packets traversed a wire, it’s been 40 years since Xerox, Digital Equipment Corp. and Intel got together to drive the publishing of the IEEE standard for Ethernet.”
Ethernet’s CAT-5 cables with RJ-45 connectors
How right he is. Whether you’re managing an enterprise data center or sipping a Wi-Fi 6 signal in a coffee shop, Ethernet tech lurks behind the scenes, receiving and sending data at a breakneck pace.
5 virtues of Ethernet
Gary gave us 5 top reasons why Ethernet has been such a high-value tech enabler for more than 50 years:
> Abundant bandwidth: Over time, Ethernet bandwidth has increased exponentially. Having started in 1973 with a bandwidth of just 2.9 megabits per second (Mbps), today’s Ethernet can stream data packets at an astounding 800 gigabits per second (Gbps).
> Backward compatibility: New technology often lays waste to earlier investments. Not so with Ethernet. Its new iterations connect seamlessly with existing switches, routers and other networking gear.
> Evolving use cases: Ethernet keeps finding new jobs. Carrier-grade Ethernet can carry a signal clear across the country; Ethernet also underpins storage networks, HPC, Wi-Fi for ubiquitous client connectivity, high-speed server connectivity, and the switch-to-switch links at the core of our networks. There’s no need to buy specialized equipment or switch packet formats along the way… it’s Ethernet!
> Commitment to openness: After being ratified by the IEEE in 1983, Ethernet began new life as a global open standard. Today, nearly 40 years later, Ethernet continues to evolve. This is due in large part to its openness and accessibility, both of which encourage enterprises that otherwise compete to develop on a common platform.
> Interoperability: With standards comes interoperability! But that isn’t always guaranteed. With Ethernet come thousands of vendors working tirelessly to ensure that products work seamlessly for customers. The tech community has worked incredibly hard to ensure that when a customer installs a server from vendor A, a network adapter from vendor B and a switch from vendor C, they all work together.
> Fierce competition: Because of its history as a reliable open standard, Ethernet has had a positive effect on the marketplace. It drives down costs, attracts venture capital and encourages healthy competition.
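That first virtue is worth quantifying. Using the two figures cited above (2.9 Mbps in 1973, 800 Gbps today), a quick sketch counts the overall speed-up and the equivalent number of doublings:

```python
import math

# Ethernet line rates cited above: 2.9 Mbps in 1973, 800 Gbps today.
START_BPS = 2.9e6
TODAY_BPS = 800e9

growth = TODAY_BPS / START_BPS   # overall speed-up factor
doublings = math.log2(growth)    # equivalent number of doublings

print(f"About {growth:,.0f}x faster: roughly {doublings:.0f} doublings in ~50 years")
```

That works out to a speed-up of more than 275,000x, or about 18 doublings over five decades.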
“There’s a saying in the tech world,” Gary says. “Nobody bets against Ethernet.”
The next 50 years
Ethernet is here to stay, at least for the foreseeable future. Sure, it’s 50 years old. And, yes, when calculated in “tech years,” that’s practically an ice age.
But you gotta hand it to Ethernet co-inventors David Boggs and Bob Metcalfe. They built a platform with almost unheard-of longevity.
Boggs passed away recently at the age of 71. But there’s every reason to believe his dream of an open-standard, high-speed data transfer protocol will live on for another half-century.
So what does the future hold for our beloved networking mainstay? According to Gary Gigabit, more of the same: more innovation, and more speed.
Raising the bus bar
However, there’s a catch. The current iteration of Ethernet has once again surpassed the capabilities of the PCI Express (PCIe) devices with which it often interfaces.
“Ethernet has transitioned from 10 Gbps to 800 Gbps at the server,” Gary explains. “But the current PCIe 4 can only handle 200 Gbps. To go faster, we’ll need Gen 5 PCIe buses.”
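A rough sanity check on those numbers, using the per-lane transfer rates from the PCIe specifications. The x16 slot width is my assumption, and these are raw one-direction figures; usable throughput is lower after protocol overhead, which squares with the roughly 200 Gbps Gary cites:

```python
# Per-lane transfer rates and encoding overhead, per the PCIe specs.
# Effective per-lane bandwidth = transfer rate (GT/s) * encoding efficiency.
PCIE = {
    4: (16.0, 128 / 130),  # Gen 4: 16 GT/s, 128b/130b encoding
    5: (32.0, 128 / 130),  # Gen 5: 32 GT/s, 128b/130b encoding
}


def slot_gbps(gen: int, lanes: int = 16) -> float:
    """Raw one-direction bandwidth of a PCIe slot, in Gbps."""
    rate_gt, efficiency = PCIE[gen]
    return rate_gt * efficiency * lanes


print(f"Gen 4 x16: {slot_gbps(4):.0f} Gbps")  # ~252 Gbps raw
print(f"Gen 5 x16: {slot_gbps(5):.0f} Gbps")  # ~504 Gbps raw
```

Even a full Gen 5 x16 slot tops out well under 800 Gbps raw, which is why still-faster buses will be needed as Ethernet keeps climbing.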
If faster speeds lead to greater possibilities and spiraling innovation, does Gary see great things for the near future of Ethernet-based communication?
In a word, yes.
“There are so many possibilities for Ethernet to come,” says the Intel engineer. “AI, at-home CAT scans, communications with Mars — the sky’s the limit.”
Virtual Reality isn’t just for gaming anymore.
In fact, VR is headed for its next evolution. As VR works its way deeper into our lives, we’ll see sales of this relatively new tech branch off into verticals that include education, medicine and manufacturing.
There’s a tech provider opportunity in the offing — and it could be big.
The change is already happening. Last year the global VR market was worth nearly $5 billion. But only $1.4 billion of that, or less than 30%, came from gaming, according to Statista. The rest came from other consumer and business applications.
Looking ahead, Statista predicts the overall VR market will be worth more than $12 billion by 2024. That’s an increase over last year of 120% — a good sign that now may be the right time to get a foothold in the VR market.
VR goes to school
Remember those long hours in history class, watching the clock as the teacher prattled on about ancient Egypt? Sometimes those hours felt like centuries.
What if you could instead don a pair of VR goggles and take a virtual walk through the labyrinthine catacombs deep in the pyramids of Giza? Imagine how engaging, how enthralling the lesson might become.
That’s why educators have been testing VR in classrooms for years. Now brands like Lenovo are taking notice.
Lenovo’s VR Classroom 2 system is designed specifically for K-12 classrooms. It features Lenovo’s Mirage VR S3 goggles and a proprietary software bundle designed for both teachers and students.
Lenovo Mirage VR S3: The teacher’s newest pet
That’s the plan for the future of education. By giving students virtual access to their world instead of dusty old textbooks, teachers hope to better engage kids and inspire them to follow their dreams.
Designing a new reality
Another area ripe for VR change is design and engineering.
Architects and designers can use software like 3DS Max by Autodesk to pre-visualize new buildings and homes. Like many design software platforms, 3DS Max lets users connect with popular VR gear like Meta’s Oculus Quest 2 goggles.
Architecting with VR: Blueprints not required
(Photo: The Wild)
Once a design is complete, an architect’s clients can use VR for a virtual walk-through of the space. This gives them a chance to voice their approval (or dissent) before expensive construction begins.
This same approach could hold for other design-related verticals. VR will work its way deeper into areas including vehicle manufacturing, renovations and civil engineering.
As those markets become accustomed to the capabilities VR can offer, your customers may call on you to provide VR hardware, peripherals and support. Will you be ready?
Virtual medical training
Healthcare is another vertical market with myriad uses for VR. Developers that focus on healthcare are pushing the envelope to create new VR solutions for surgical training, assessment and therapy.
VR systems can be used to train medical students, enabling them to virtually experience emergencies, procedures and other treatments. One big advantage of learning virtually: No real-life consequences if a student makes a mistake.
Similarly, practicing doctors and nurses can learn new treatments with VR. And surgeons can use VR to practice new, complex techniques before operating on live patients.
One company, Osso VR, offers a VR surgical training and assessment platform that features 120 training modules. It’s used by healthcare providers including Marshall University and Hospital for Special Surgery.
To be sure, the software that healthcare VR makers provide is highly specialized. But it’s nonetheless designed to be compatible with popular VR hardware from brands including Meta and HTC.
Surgical training with VR: This won’t hurt a bit
(Photo: VR Fitness Insider)
The rest of us can benefit from virtual medicine, too. Our doctors can use VR to visualize our unique physiology. Through this process, we can better understand our bodies — and learn to take better care of them.
Don’t skimp on your research
Thinking of jumping into the VR game? It’s always a good idea to do some homework before pulling the trigger.
As VR tech proliferates, the market will become saturated with new products — some good, some really quite terrible.
Leveraging your precious cash flow to create a great VR portfolio could be a wise decision. But only if you make wise choices when shopping for hardware, software and services.
Are you ready to step into the future? Then pull those goggles down over your eyes and take a look at the new world of virtual reality. You might just love what you see.
Own a 5G smartphone? Plan to fly soon? If so, you could be affected by the dispute between 5G cell service and air travel.
When it comes to 5G vs. aircraft, the most likely scenario is mainly a matter of inconvenience — you’ll experience either a delayed flight or delayed data. Both can be annoying, but neither is life-threatening.
C is for “Culprit”
This whole debacle comes down to an argument over what’s known as C-band radio frequencies — specifically, those between 3.7 and 3.98 GHz. These frequencies are now used by some 5G cellular towers.
The issue is, the C-band is too close for comfort to the frequencies used by aircraft altimeters. These important devices, used in most modern aircraft, operate primarily in the slightly higher 4.2 to 4.4 GHz range. But there’s enough crossover with the nearby C-range to worry the U.S. Federal Aviation Administration (FAA).
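A small sketch makes the spectral geometry concrete, using the band edges cited above:

```python
# Band edges in GHz, per the figures cited above.
C_BAND = (3.7, 3.98)     # contested 5G C-band allocation
ALTIMETER = (4.2, 4.4)   # radio altimeter operating range


def guard_band_mhz(lower, upper):
    """Spectral gap between the top of one band and the bottom of the next."""
    return (upper[0] - lower[1]) * 1000  # GHz -> MHz


gap = guard_band_mhz(C_BAND, ALTIMETER)
print(f"Guard band: {gap:.0f} MHz")  # only 220 MHz separates the two
```

A 220 MHz guard band sounds like a lot, but altimeter receivers built before the C-band auction weren’t necessarily designed to filter out strong signals that close by.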
The FAA has been very clear: Don’t mess with radio altimeters. These devices feed as many as 17 vital aircraft systems, including the engines, terrain avoidance, collision avoidance and speed brakes.
Altimeter (upper right) of a Boeing 737 jet
Perhaps most important, radio altimeters help pilots land their heavy aircraft when visibility is low, such as during pelting rain, thick fog or heavy snow. That’s also when an aircraft is close enough to the ground for the signal from a terrestrial 5G antenna to cause interference.
Need for speed
Okay, but couldn’t we just go about our lives without 5G service operating in the C-band? Sure we could. But we don’t want to.
That’s because C-band (aka mid-band) is the sweet spot. It sits right in the middle of the spectrum, offering the best combination of speed and coverage.
The low-, mid- and high-band of the 5G spectrum
By comparison, low-band frequencies (600 to 850 MHz) offer better dispersion over large geographical areas. But the data down there can be thin on the ground. In fact, low-band data streaming speeds are not much faster than those of older 4G.
At the other end of the spectrum is the high-band (24 to 47 GHz), also known as the millimeter-wave frequency. Here, data travels so fast, it feels like you’re streaming over a Wi-Fi connection. With a solid millimeter wave connection, you can download HD movies in seconds and play the latest AAA games without latency or drop-outs.
But millimeter-wave has an Achilles heel: It cannot travel over long distances or through walls. Until these issues are overcome by wireless companies such as Verizon and AT&T, high-band streaming in all but the most favorable of physical locations will remain a dream of the near future.
For now, C-band offers the best of both worlds. On the one hand, you’ve got an operating area almost as wide as low-band. On the other, you get data speeds that, while not as fast as high-band, still stream up to 10 times faster than older 4G.
Just a Band-Aid?
The industry’s main response to the C-band dilemma has been to strike an uneasy compromise. The FAA is now approving the use of various radio altimeters — but only when airlines and manufacturers can prove the device is all but infallible when operated near a C-band-using 5G cell tower.
Also, the FAA’s approval process is slow. While many radio altimeter-equipped aircraft have been officially cleared for a safe landing, there have also been many exceptions.
To date, about 20% of the U.S. commercial fleet is still sitting in hangars due to the altimeter/C-band issue. Of particular note is the grounding of the country’s most prolific wide-body, long-haul jet, the Boeing 777.
Wireless carriers are helping, too. They’ve agreed to keep their C-band-emitting cell towers at least 2 miles away from most major U.S. airport runways, at least until a better resolution comes along.
It’s not yet clear how the 5G story will end. But chances are we’ll find out soon.
Organizations on both sides of this issue include some of the country’s wealthiest, most powerful companies. They want this issue resolved, and they have the resources to make it happen.
On one side is the beleaguered aviation industry. It’s had a rough time since the onset of the global pandemic. With thousands of flights canceled daily during the worst days of the pandemic, the airlines, aircraft manufacturers and pilots’ unions are looking for a way to stop the red ink from bleeding them dry.
On the other side are the wireless carriers — Verizon, AT&T, et al. They’ve already spent more than $80 billion buying the rights to use certain parts of the radio spectrum, including C-band frequencies. Now the carriers want to transform that investment into a lucrative return.
Apple iPhone 13: Full 5G C-band connectivity? Yes, please
But don’t forget hardware designers like Apple and Samsung. They’ve been shipping 5G handsets since 2019. Last year, phone makers shipped nearly 90 million 5G devices in the U.S. alone.
All that boils down to a group of highly motivated multibillionaires on the warpath to resolution.
When and how the 5G issue gets resolved is anyone’s guess. For now, the best thing to do before traveling is to draw up a contingency plan in case your flight gets canceled.
Or, ya know, take the train.
Are you and your customers ready for a smart, trendy upgrade? Then get ready to kick it — er, type it — old school.
A new generation of mechanical keyboards is here. It offers improved ergonomics and modern convenience, all while retaining that classic clickity-clack.
For the uninitiated, the value of a mechanical keyboard may not be readily apparent. After all, these keyboards are bigger, louder and more expensive than your average typing surface.
Are they worth it? That depends on what you plan to do once you lay your fingertips on those keys.
Avid gamers swear by mechanical keys. They love how these keyboards offer the perfect amount of travel, resistance and audible click. That combo can provide both a great typing experience and a true competitive advantage.
What really distinguishes these keyboards is their use of mechanical switches. Unlike the membrane-switch keys commonly found on laptops, mechanical switches — such as the famed Cherry MX series — require a little extra down-force. This helps to reduce accidental keystrokes that could ruin a game.
Mechanical switches are also designed to return to their up positions very quickly. This greatly accelerates double-taps and other complex gaming maneuvers. Given the rapid-fire pace of today’s games, that’s huge.
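Why does switch reset speed matter so much? Because keyboard firmware has to filter out the electrical “bounce” a switch produces on contact, and that debounce window puts a floor on how fast a double-tap can register. Here’s a minimal sketch of the idea in Python — the 5-millisecond window is an illustrative value, not any particular keyboard’s spec:

```python
def debounce(press_times_ms, window_ms=5):
    """Filter raw key-down timestamps, dropping any event that
    arrives within window_ms of the last accepted press.
    Those rapid repeats are treated as switch bounce, not intent."""
    accepted = []
    for t in press_times_ms:
        if not accepted or t - accepted[-1] >= window_ms:
            accepted.append(t)
    return accepted

# A clean double-tap 50 ms apart registers twice; the 2 ms
# blips right after each press are discarded as bounce.
events = [0, 2, 50, 52]
print(debounce(events))  # [0, 50]
```

A switch that settles quickly lets the firmware use a shorter window, which is exactly what makes rapid double-taps land as two keystrokes instead of one.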
There are quite a few mechanical switches out there. Each variant offers a slightly different combination of sound, motion, resistance and feel.
How to find the perfect fit? It’s usually a matter of reviewing the various options, then employing some good old-fashioned trial and error. One helpful starting place is this Keyboard Co. guide to Cherry MX switches.
Once you’ve settled on your ideal switch, it’s time to choose the rest of the keyboard. Looking for a decidedly premium experience? Then consider the Drop CTRL Mechanical Keyboard.
Drop CTRL: An uncommonly customizable keyboard
Starting at $200, Drop’s keyboard certainly isn’t cheap. But it’s well built and uncommonly customizable. You and your customers might find it’s well worth the extra money.
The CTRL features a sturdy aluminum frame, adjustable feet and dual USB-C connectors. And each key has its own custom key-mapping and RGB lighting.
Mechanicals for the masses
But mechanical keyboards are not only for gamers. A classic keyboard design offers far more than just advanced sword-play and laser fire.
All-day typists can use these keyboards, too. For them, a well-designed mechanical keyboard can provide more comfort, greater longevity and better ergonomics than your average typing surface.
Need proof? Look no further than that mass-market peripheral powerhouse, Logitech. The company’s latest mechanical wonder, the G915 TKL, is a modern and chic home-office take on the classic mechanical keyboard.
Logitech G915 TKL: gaming or working, your choice
Sure, Logitech puts the word “gaming” in the keyboard’s marketing copy. But that’s just to ensure we take this mechanical monster seriously. Look deeper at the G915’s construction, and you’ll see that it’s been designed for work and play in equal measure.
The giveaway is the keys themselves. A bona fide gaming board would offer either the aforementioned Cherry MX switches or those of a close competitor with gaming cred such as Helio.
Instead, the G915 offers a choice of three different mechanical switches, all made by Logitech itself. These switches, like most mechanical keys, are differentiated using the universal color scheme originally defined by Cherry.
But wait, there’s more. The Logitech G915 also displays its office-centric focus with premium bells and whistles. Of particular note are the keyboard’s low-profile design, integrated media controls, and one-button switching between wireless and Bluetooth devices.
Okay, it’s not cheap. The Logitech G915 TKL retails for around $190.
The art of the upsell
At this point, you may be wondering whether mechanical keyboards offer the channel an opportunity. Here’s one to consider.
Computer sales tend to catalyze peripheral sales. Too often, this means sourcing the cheapest compatible keyboards available.
But what if you instead offered your customer a high-margin peripheral? One that looks good, feels great and will work well for years to come? (Mechanical keyboards are particularly durable.)
Bottom line: Mechanical keyboards offer a fun and uniquely personalized typing experience. Whether your customers are heavy-duty gamers or all-day office workers, a mechanical keyboard might just be love at first touch.
Hybrid work is the new normal. For the foreseeable future, hardly anyone’s going back to the office Monday through Friday, 9 to 5.
Instead, workers are now found in offices, at home, on park benches, in airplane seats, and just about anywhere else that offers a flat surface and a decent internet connection.
However, working from these various locations is still harder than it should be. Mainly, that’s due to a decided lack of device interoperability.
Major platforms still incompatible
You might think that by now, all the major platforms — Windows, macOS, Android, iOS, Chrome et al. — would be fully interoperable.
But just try sharing a file in a proprietary format with another platform, and you’ll see how wrong you are. An iPhone won’t play nice with a Windows 11 ultrabook. That Android tablet prefers not to shake hands with nearby PC monitors. Your smart TV doesn’t speak the same language as your in-car display.
The answer? Big Tech thinks it’s cross-device setups. If they’re right, this could be the perfect solution for the modern workforce. And a new business opportunity for tech providers.
Bridging the gap between devices
How do cross-device setups help the modern worker? The answer may be found in Intel’s recent acquisition of Israeli tech startup Screenovate.
Founded in 2009, Screenovate is working to commercialize cross-platform software. Essentially, the company’s software duplicates and displays the contents of a mobile device on the screens of other devices, such as PCs, smart TVs and 2-in-1 convertibles.
Intel’s purchase of Screenovate is a prime example of how Big Tech is working to bridge the gap among the many devices we interact with every day.
Intel isn’t alone. Other tech titans, including HP and Microsoft, are getting into the game, too. Their goal: give workers a smooth transition between all their devices. That includes mobile phones and tablets, desktop and laptop PCs, and secondary devices such as smart displays.
Microsoft’s human-centric approach
Connecting devices is the purpose behind Microsoft Graph. It’s a new API-based platform designed to replace workers’ device-centric activities with what Microsoft calls a human-centric approach.
When workers move from one device to another, they can lose data, time and patience. Microsoft says Graph can help by enabling software designers to build in a menu of recent activities.
Then the contents of this menu are stored in the cloud. From there, they’re pretty much available from any connected device.
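Under the hood, that cloud-hosted activity menu is reached through ordinary REST calls. Here’s a hedged Python sketch of fetching a user’s recent activities from Microsoft Graph’s activity feed API — the access-token handling is assumed to happen elsewhere, and an app would of course need the right Graph permissions:

```python
# Sketch: building the request for a user's recent cross-device
# activities via Microsoft Graph's activity feed API.
GRAPH_BASE = "https://graph.microsoft.com/v1.0"

def recent_activities_request(access_token, top=10):
    """Build the URL and headers for GET /me/activities/recent.
    `top` limits how many recent activities come back."""
    url = f"{GRAPH_BASE}/me/activities/recent?$top={top}"
    headers = {"Authorization": f"Bearer {access_token}"}
    return url, headers

url, headers = recent_activities_request("<token goes here>")
print(url)
```

An app would issue this GET with an HTTP library, then render the returned activity list as its “pick up where you left off” menu — the same list, from any signed-in device.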
Microsoft Graph: sharing activities with any supporting device
Using Microsoft Graph-enabled devices, your customer’s typical workday morning might go something like this:
7 a.m.: Your customer wakes up. Looking at their bedside smartphone, they see a display of yesterday’s collaborative project.
7:30 a.m.: While eating breakfast, your customer has the latest edits to the project read aloud by a smart speaker.
8:30 a.m.: Your customer drives to work. On the way, they have the car’s dashboard screen host a chat or conference call and display the PowerPoint presentation controlled by a remote worker.
9 a.m.: Your customer is now seated at a traditional office desk. Now they can click a button to call onto their work PC all the latest data from that productive morning commute.
Or, if your customer is working from home that day, all these devices (excepting the car’s dashboard) can similarly share data, files and more.
New, cross-device business for tech providers
Will the coming cross-device revolution bring new sales opportunities? It’s a pretty good bet.
New hybrid work-focused devices like the recently announced Dragonfly ultrabooks and E-series conferencing monitors from HP will soon hit the market.
The HP Dragonfly features a videoconferencing enhancement called HP Presence. The device also offers myHP, an application that gives users a dashboard to control and customize their PC experience.
HP Dragonfly: virtual meetings built-in
Also, a new generation of devices enabled by the Intel/Screenovate screen-mirroring tech could kick off a mobile-tech buying spree.
Software will likely be a major player, too. New versions of faithful work staples like Word and Excel will gain new Microsoft Graph superpowers. To take advantage, your customers will need to update older software titles and perhaps buy new apps too.
The new normal of hybrid work will demand cross-device compatibility as a matter of course. Now’s the time to get the ball rolling — and to let that ball roll among your customers’ myriad devices.
The graphics processing unit (GPU) is tech’s ultimate nerd. For many years, it lurked in the shadow of its far sexier cousin, the central processing unit (CPU).
Now they’re reversing roles. Today it’s the GPU that’s powering sexy new tech like AI and machine learning. And the CPU that’s handling boring old text documents, spreadsheets and other mundane chores.
No, this isn’t the end of the CPU. But with Moore’s Law winding down and a mixed-reality revolution on the way, it’s now the GPU’s time to shine.
GPU: a brief history
The history of the GPU dates back only a little over 20 years. It was 1999 when NVIDIA introduced the industry’s first widely available GPU, the GeForce 256.
Back then it was all about graphics (hence the name). The best engineers, seeing the ascendance of the graphical user interface (GUI), surmised that computers would soon need much more power to process graphics.
NVIDIA’s 1st GPU: GeForce 256 (1999)
They were right. And over time, GPUs have gotten faster and smarter, and graphics processing has become a powerful enabler. Better, faster and more powerful GPUs have let video-game designers offer hyper-realistic detail, super-fast frame rates and smoother action.
Multimedia creators fell in love with the GPU, too. New generations of 2D, 3D and video artists have used multiple GPU arrays to create modern art exhibits, animated films, and countless independent movies that now stream on Netflix and other popular services.
GPU vs. CPU: What’s the difference?
Though GPUs and CPUs are both processors, they operate differently.
The CPU, using a technique called serial processing, tackles a wide range of tasks with brute force and lightning speed. It’s called “serial” because the CPU processes one operation at a time, moving on to the next operation only after it has completed the previous one.
Also, a CPU has relatively few cores, typically no more than a dozen or so. However, each CPU core uses a large cache and broad instruction set. That’s how a CPU can manage a computer’s every input and output.
CPU vs. GPU: more cores, more simultaneous operations
GPUs use way more cores — literally hundreds, even thousands of them. For an extreme example, one of NVIDIA’s current GPUs, the GeForce RTX 3070 Ti, contains 6,144 cores.
This lets the GPU process thousands of operations simultaneously, a technique known as parallel processing. In this way, the GPU and its associated software can take a big workload — for example, rendering a 3D graphic — and divide it into smaller ones.
Then the GPU can process all those smaller pieces at once, completing the total workload in much less time than a CPU could.
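That divide-and-combine pattern is easy to sketch in ordinary Python, even though a real GPU spreads the pieces across thousands of hardware cores rather than a handful of threads. The chunk count and the toy “brighten a row of pixels” workload below are illustrative:

```python
from concurrent.futures import ThreadPoolExecutor

def brighten(pixels):
    """The 'small workload': brighten one chunk of an image row,
    clamping each value to the 0-255 range."""
    return [min(p + 40, 255) for p in pixels]

def render_parallel(pixels, chunks=4):
    """Split a big workload into pieces, process the pieces
    simultaneously, then stitch the results back together --
    the same pattern a GPU applies across thousands of cores."""
    size = max(1, len(pixels) // chunks)
    pieces = [pixels[i:i + size] for i in range(0, len(pixels), size)]
    with ThreadPoolExecutor() as pool:
        done = pool.map(brighten, pieces)  # order is preserved
    return [p for piece in done for p in piece]

row = [10, 100, 200, 250] * 4
assert render_parallel(row) == brighten(row)  # same answer, computed in pieces
```

The parallel version produces exactly the same result as the serial one; the win comes from doing the pieces at once — which is why the payoff scales with core count.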
Other advanced technologies can further enhance GPU performance. Intel’s Arc graphics processors, now shipping, offer hardware-accelerated Ray Tracing, Xe Super Sampling (XeSS) AI-driven upscaling technology, and Intel Deep Link technology.
Back to GPU future?
While today’s GPUs are cutting-edge, the approach they use dates back to the 1970s. That’s when Seymour Cray and his colleagues built gigantic supercomputers such as the Cray-1, whose vector design applied a single operation to many data elements at once, a close ancestor of today’s GPU parallelism.
Cray-1 supercomputer: parallel processing circa 1975
Back in the 1970s, artificial intelligence was just a glimmer in the eyes of sci-fi writers and comic book artists. But whether Seymour Cray knew it or not, his legacy — and that of his fellow engineers — is being realized as we speak.
Bleeding-edge technologies such as AI, machine learning and mixed reality are now being powered by the parallel processing of the latest GPUs.
For example, when you ask Alexa whether it’s going to snow tonight, it’s a GPU that parses your words, scrapes meteorological data from the web, and tells your smart speaker what to say.
Similarly, after your doctor orders up an MRI for your aching back, it’s a GPU that processes the images, recognizes the patterns, and assists with the diagnosis.
That’s today. Tomorrow, when your personal robot offers you a good movie, teaches your child to speak French, and then pours a chilled mug of your favorite beer, all that will be powered by a GPU. Talk about revenge of the nerds.
New year, new website? That’s a good idea. And with do-it-yourself website builders, creating a new website doesn’t have to cost you a bundle.
The latest online website builders from Squarespace, Wix and the like make it easy to pull together a DIY website that looks like it cost a mint.
The best builders also offer drag & drop design, integrated graphic editing tools, and an array of plug-ins for added functionality.
You’ll also save money on what you don’t need. When you build a site with these DIY tools, you won’t need the help of pro coders, photographers or search-engine optimization (SEO) experts.
But buyer beware. Not all DIY website platforms are safe and reliable. Here’s your guide to getting started.
Squarespace: Flying first class
Squarespace is a premium option with a premium price tag to match — the basic package starts at $12 a month.
Adding more sophisticated features like eCommerce, code injection, and web analytics will push the monthly price up. For instance, the advanced Commerce subscription costs $40 a month.
You do, however, get what you pay for. Squarespace offers great support, beautiful templates, and flawless browser and mobile compatibility.
Squarespace: Slick interface, cool features
There’s a plug-in for that
The best cloud-based website builders extend their functionality with plug-ins and integrations.
These make it simple to add complex features to your site. That could include real-time shipping calculations, customer appointment scheduling, and accounting. Email, SEO, and social media marketing tools are usually on offer, too.
Sure, these features will help push your costs up — and up, and up. But before you pass judgment, be sure to look at the total cost of ownership (TCO). A tightly integrated system with a slick user interface could in the long run save you time and money.
When free really isn’t
How about WordPress? After all, you could argue it started the whole DIY website revolution in the first place.
But you could also argue that WordPress gave the revolution a bad name.
As you may know, WordPress is an open-source website builder platform, and anyone can install and use it for free. But remember, you get what you pay for.
WordPress plug-ins can extend functionality
Free software might save you money upfront, but it could end up costing more over time. That’s because sites built with WordPress can end up plagued by malware, often introduced through poorly maintained third-party plug-ins and themes. The platform’s open-source ecosystem means no single vendor vets every extension its users install.
For example, in November web-hosting company GoDaddy admitted that its managed WordPress hosting environment had been hacked. GoDaddy said the cybercrooks may have gotten their hands on the email addresses and other information of up to 1.2 million customers.
Is WordPress a lost cause? No, not at all. In the right hands, it can be a powerful tool.
In fact, WordPress’ design and functionality can rival those of more-expensive options. The question is, how much will those “right hands” cost you?
It’s much easier to get started with a website builder than it is to change from one platform to another.
Why? Because the convenience of an all-in-one platform like Squarespace often leads to virtual entrenchment that’s hard to reverse.
Before you know it, the platform is hosting your domain name, website, accounting system, online payment gateway, mass marketing email lists, customer database, and who knows what else.
So take your time, and review the options. The platform you choose may be yours for a long while.