Metaverse technology may be new, but it’s surprisingly easy to find a metaverse meeting space.
A growing number of suppliers are eager to invite your hybrid workforce into the infinite space of Web3. They include not only tiny startups, but also industry giants such as Google, Meta and Microsoft.
But is today’s metaverse a good place for you and your team to work together? It really depends.
On the one hand, virtual environments offer immersive collaboration, easy access to multimedia training tools, and a feeling of togetherness. All that far surpasses the cold separation of video chats on Zoom, Teams and Meet.
Meta’s Horizon Workrooms: Welcoming both virtual avatars & video callers
On the other hand, the metaverse still exists in a Wild West status. Even the most stable, feature-rich platforms often come with high doses of confusion, glitches, tech issues—even motion sickness.
Despite those setbacks, it’s hard to imagine a future without some aspect of metaverse technology pervading our lives. If the pandemic has taught us anything, it’s that tech like this can prove invaluable.
Accenture’s bold trial
Accenture certainly thinks so. Last year, the consulting firm bought 60,000 Oculus VR headsets from Facebook parent Meta.
Accenture plans to use the VR headsets when onboarding the 150,000 new employees it hires each year. Now those newbies will spend their first days at work on what Accenture calls the “Nth Floor.” That’s a corporate metaverse developed for the firm by Microsoft.
Once there, new Accenture employees will find themselves immersed in a custom virtual environment. They’ll have access to interactive training materials, videos and various meeting spaces where they can give a virtual “hey” to their fellow hires.
Sure, Accenture employees can slide on their goggles while working from Accenture’s main campus in Florham Park, N.J. But the real charm of this system is that they can also enter the Accenture metaverse from a hotel in Helsinki, a hostel in Hong Kong, or anywhere else with a hot Wi-Fi signal.
To enter the metaverse, you’ll need a VR headset like the Oculus 2
Interested in giving metaverse meetings a try? For the price of an Oculus 2—around $400—you and your customers can get an idea of what Accenture’s virtual space looks like by checking in to Meta’s Horizon Workrooms or Microsoft Mesh. While there, you can also create a virtual space of your own.
Finding your (virtual) flow
Another metaverse contender is Teamflow. At first glance, it appears to be your typical tech startup—a scrappy insurgent that prides itself on being far more nimble and forward-thinking than the big guys.
But the real proof of Teamflow’s success is in its impressive client list. This startup has already landed accounts with Dropbox, Netflix and Uber, to name just a few.
Teamflow’s ace in the hole is its thoughtful design. Plus, the company has focused on a target demographic that values ease-of-use and competitive pricing.
Teamflow lets you create a unique coworking office space
Getting started with Teamflow is simple: You just create a virtual office. Prices start at free for up to 5 people, then go as high as $25 per employee per month.
With that, you’re ready to experience what Teamflow calls “spontaneous coworking.” The idea is to virtually wander through the space, letting serendipity happen as it can in a physical office.
There are virtual water coolers, open-plan offices, nooks, crannies and private meeting rooms. All give Teamflow inhabitants the chance to seek out like-minded coworkers and start new projects.
There’s some gee-whiz tech in Teamflow, to be sure. For instance, the sounds you’ll hear are governed by the virtual distance between your avatar and those of your coworkers.
This technology, known as “spatial audio,” helps to eliminate the cross talk that all too often plagues Zoom and Teams meetings. It also creates a remarkably lifelike environment: if you want to talk to another avatar, you must first virtually walk toward them, close enough that they can hear you.
Similarly, if you want to discuss a point in private, your avatar and another can virtually walk away from a larger group. By creating this virtual distance, you ensure that the rest of the group won’t be able to hear your brief confab.
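Under the hood, spatial audio systems like this typically scale volume by virtual distance. Here's a minimal sketch of the idea; the function name, falloff model and radii are illustrative assumptions, not Teamflow's actual implementation:

```python
import math

def spatial_volume(listener, speaker, full_volume_radius=2.0, max_range=12.0):
    """Return a 0.0-1.0 volume level based on virtual distance.

    Within full_volume_radius the voice plays at full volume; beyond
    max_range it is silent; in between it falls off linearly.
    (Real engines often use inverse-square or logarithmic falloff.)
    """
    dx = listener[0] - speaker[0]
    dy = listener[1] - speaker[1]
    distance = math.hypot(dx, dy)
    if distance <= full_volume_radius:
        return 1.0
    if distance >= max_range:
        return 0.0
    # Linear falloff between the two radii
    return 1.0 - (distance - full_volume_radius) / (max_range - full_volume_radius)

# Walk your avatar closer and the voice fades in
print(spatial_volume((0, 0), (20, 0)))  # out of earshot -> 0.0
print(spatial_volume((0, 0), (7, 0)))   # mid-range -> 0.5
print(spatial_volume((0, 0), (1, 0)))   # side by side -> 1.0
```

Walking away from the group simply pushes the computed volume to zero, which is all the "private confab" feature really needs.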
Ready for prime time?
Those who love tech in general will see the metaverse workspace as decidedly neato burrito. But the stodgy set—those who describe Zoom with a four-letter word (the bad kind)—may have a hard time getting used to meeting in the metaverse.
Chief among their complaints will be the VR goggles. For one, they’re expensive. VR goggles also are heavy enough to become uncomfortable after wearing them for 30 minutes or so.
The other big worries of the stodgy set are likely to include security and privacy. And who can blame them? With Meta behind so much of the metaverse, it’s reasonable to have a bad Facebook-flavored taste in your mouth.
Still, there’s no stopping the metaverse revolution. Hybrid working—with employees sometimes in the office, sometimes at home—is here to stay. Sooner or later, we’ll all need to interact in the metaverse.
Now’s the time to get a head start. Meet these new metaverse hardware and software platforms. You might find a virtual reality that suits you.
Overclocking is the dark art of computer customization. Like some mysterious black magic spell, it’s a much maligned, oft-misunderstood process.
But at its core, overclocking is pretty simple. When you overclock a component, you’re simply forcing it to operate faster than its manufacturer intended.
So is overclocking a good thing? Sure, it can be. But overclocking also has its downsides.
For every pro gamer expertly eking out a minute competitive advantage, there's some poor sod lamenting the charred carcass of a PC that was pushed a little too far.
But hey, nothing ventured nothing gained, right?
What can be overclocked?
Technically, you can overclock any component with a clock.
You’ll find clocks — also known as “timers” — in a variety of high-level computer components. These clocks synchronize their components’ internal operations.
The most common candidate for overclocking has traditionally been the CPU. But these days, modern graphics processors (GPUs) run a very close second.
RAM and motherboard chipsets also run on a clock, and therefore they can be pushed past their intended operating speeds. But this is much less common.
Usually, it’s reserved for situations in which one of these components is causing a bottleneck. By clearing the bottleneck, you can add years to an aging machine’s useful life.
How to overclock
The dawn of overclocking saw much hacking, bricking and downright burning of overclocked components. But these days, it’s as simple as clicking a button in the BIOS.
The key is starting with a component that’s been designed for hot-rodding. That’s gotten pretty easy.
Responding to the growing popularity of overclocking, chip designers including Intel have seen fit to release specially designed processors. In Intel’s case, these processors are designated with either an X (for Extreme Edition) or a K (for an unlocked multiplier).
Intel Extreme Edition CPU: unlocked & ready to overclock
These chips, unlike their utilitarian counterparts, are “unlocked.” In other words, they can be safely overclocked to your heart’s delight.
Doing so simply necessitates a quick trip to the BIOS before the operating system takes control. With the click of a button and the drag of a slider, you can choose to overclock a little or a lot. It really is that simple.
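What that BIOS slider usually changes is the CPU multiplier: the effective core speed is the base clock (commonly 100MHz on modern Intel platforms) times the multiplier. A quick sketch of the arithmetic, with illustrative values:

```python
def effective_clock_ghz(base_clock_mhz=100.0, multiplier=36):
    """Effective core speed = base clock x multiplier, in GHz."""
    return base_clock_mhz * multiplier / 1000.0  # MHz -> GHz

stock = effective_clock_ghz(multiplier=36)        # 3.6 GHz at stock settings
overclocked = effective_clock_ghz(multiplier=45)  # 4.5 GHz after a bump
print(f"Stock: {stock} GHz, overclocked: {overclocked} GHz")
```

Raising the multiplier on an unlocked chip is exactly this multiplication; the hard part, as the next section explains, is dealing with the heat that follows.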
But before you rush to overclock, heed a word of warning. That word is “heat.”
Every computer component produces some heat. The harder you push the component, the more heat it produces.
Nowhere is this more apparent than with modern, high-performance CPUs and GPUs. Your run-of-the-mill desktop CPU tends to idle somewhere around 30 C (86 F). Under a full load, it will push up into the neighborhood of 55 C (131 F).
Sure, that’s pretty hot. But it’s nothing a decent copper heatsink can’t handle. Add a well-designed fan, and it’s cool sailing for sure.
However, when you overclock that CPU, you also kick up the heat factor by a few notches. Now temps can reach 100 C (212 F) — hot enough to boil water.
As you probably know, heat is the enemy of computer systems. Component failure rates climb quickly as temperatures rise, so a sharp increase in heat translates into a much greater chance of a system failure.
What’s the solution? The answer is easy to fathom, but not so easy to implement.
The easy part is concluding that the faster you run your silicon, the faster and more efficiently you need to evacuate the hot air it creates.
What’s harder is performing that minor miracle in a 14x14-inch metal box.
Pro gamers will tell you that liquid cooling offers a happy medium. For an experienced enthusiast, the installation process is small potatoes.
Cooler Master system: Liquid cooling for overclockers
Plus, the cost for a decent liquid-cooling system can be as low as $80. What’s more, modern incarnations of liquid cooling rarely if ever require a messy fluid change.
But is it worth it?
For most of your customers, probably not. Very few computer users will benefit from overclocking. In fact, it’s become a bit of an anachronism for all but the most diehard of speed demons.
The fact is, today’s CPUs, GPUs and other components are so incredibly fast and efficient, even pro gamers, research scientists and AI developers will be hard pressed to justify the time, expense and decrease in reliability that overclocking brings.
In other words, think twice before overclocking the silicon in your or your customer’s PC. Unless they’re true PC enthusiasts, their need for speed may be sated by something off the shelf.
The rapid evolution of hybrid meetings has brought with it an array of next-gen meeting tech.
From smart cameras that zoom in and focus on each speaker to digital whiteboards accessible to both in-person and remote participants, hybrid-meeting gear makes it easy to collaborate with a team no matter where in the world you happen to be.
What makes a successful hybrid meeting? The answer is twofold.
First is a willingness to try new things, overcome challenges and open new communication pathways.
Second is the right technology. You knew that was coming, right?
Sweet sounds of success
Bad sound can bring a hybrid meeting to its knees. You can usually get away with spotty video. But when people can’t hear each other, it’s pretty much game over.
No surprise, one of the best hybrid-meeting speakers is made by famed audio designer Sennheiser. Previously best known for its pro audio headphones and mics, Sennheiser has entered the hybrid-meeting space to great effect.
Sennheiser’s nearly $500 TeamConnect Intelligent Speaker boasts above-average sound. But it takes more than that to win an iF Design Award and earn Microsoft Teams certification.
Sennheiser TeamConnect: one smart speaker
One of the speaker’s most valuable features is the ability to distinguish among 10 different voices. It does this either in the room via 7 beam-forming microphones or by remote audio input.
After learning the individual voices of each participant, TeamConnect then transcribes every word and creates a running transcript that looks like a Slack chat. The transcription is stored securely on a remote server for easy reference at any time.
Once the primo sound is taken care of, the next step is a smarter-than-your-average-bear video system.
Enter the UVC84 by Yealink. It’s a Microsoft Teams-certified smart camera designed for medium and large rooms. This camera mounts on either the ceiling or a wall.
Yealink UVC84: It knows who’s speaking
You could be forgiven for thinking the UVC84’s most valuable features are its 4K resolution, 12x optical zoom, 80-degree field of view, and software-controlled mechanical pan and tilt. But you’d be wrong.
In fact, the UVC84’s best bell/whistle is its ability to automatically focus and zoom in on whichever attendee has the floor at any given time. Even in a room chock full of participants, the camera seeks out the current speaker. It also fills the screens of all remote participants with the speaker’s head and shoulders.
That kind of artificial intelligence doesn’t come cheap. The UVC84 will set you back around $1,000.
Don’t call it a whiteboard
Let’s get one thing out of the way: Microsoft’s Surface Hub 2S costs about $9,000.
Once the sticker shock subsides, it’s time to revel in the majesty of what must be one of the most interesting collaborative technologies around.
Microsoft Surface Hub 2S: Way more than a whiteboard
Yes, the 2S is a digital whiteboard. But it’s also much, much more than that.
Its standout features include an 85-inch, high-res touchscreen; 8th gen Intel Core i5 quad-core processor; integrated camera; and Doppler occupancy sensor. (Okay, that last one is a little bit creepy. But still.)
At this point, you may be thinking that Surface Hub 2S is basically a giant all-in-one PC. Technically that’s true—except for the way you interact with it.
This behemoth interactive screen is designed to either mount on a wall or perch on an easel. Place it at the front of a meeting room, and you’ll see people walk up to it as it becomes their turn to present. Not exactly your run-of-the-mill Windows machine.
What’s even more interesting is how the Surface Hub 2S enables remote workers to easily collaborate in a common virtual space. Any participant connected through the Microsoft Teams session can take control of the screen to share and edit ideas.
Those lucky/unlucky enough to be in the office will see screen elements move in real-time as remote workers put in their two cents. The same is true for those at home with their feet up on the couch.
There’s so much hybrid meeting tech around—and so much more on the way. The key to a good solution lies in choosing the tech that makes the most sense for you and your customers. And as the prices above show, that includes your customer’s budget.
But one thing’s for sure: Hybrid meetings are here to stay.
Pandemic or not, workers are going to be phoning it in for years to come. And for tech providers like you, that turns hybrid-meeting tech into growing opportunity.
Maybe you should have a meeting about that.
Gaze at a mini-PC long enough, and you may wonder how something so small can contain so much computing power.
After all, that turtle-shell-sized box doesn’t occupy more than a tiny sliver of your desk, doesn’t make much noise, doesn’t kick off much heat, and doesn’t bear much resemblance to the hulking tower it replaced.
Yet every time you say “jump,” it asks “how high?” Opening a dozen browser tabs won’t slow it down. Neither will a PowerPoint chock full of high-res graphics and gee-whiz animations.
In fact, you can ask that little fella to jump through just about any hoop and, assuming it’s configured with enough horsepower, it’ll just go ahead and leap through.
Intel’s most popular NUC Mini PC: just 4x4 inches
But how? How does a mini-PC run circles around the gargantuan steel towers of yesteryear? Perhaps more to the point, how does it complete those circles without breaking a sweat, much less bursting into flames?
Get cool, boy
That’s an important question. Central processors produce heat—quite a lot of heat, in fact. So do GPUs, memory modules and SSDs.
In fact, every component inside a computer adds to the overall ambient temperature. If and when that temperature gets too high, the components will throttle themselves back for safety. But if the computer heats up beyond a safe zone, they will eventually shut down.
That’s why, when Intel engineers set out to design the first 4 x 4-inch NUC Mini PC, they had to solve a big problem in a small space.
Early on, said engineers discovered that small cooling fans make a big noise. These fans also suck in dirt and debris, and that can wreak havoc in the confines of a chassis.
Replacing the fans with heat sinks was the obvious answer. But a passive cooling system is only effective if the internal ambient temperature stays below a certain level.
To keep things on the cool side, the Intel NUC includes a remarkably efficient processor. Built on the Intel 7 process, the Core i7-12700 operates at speeds up to 4.9GHz with a 25MB cache, yet boasts a thermal design power (TDP) of only 65 watts.
Simply NUC: Heat sinks outside the chassis, too
Combine that relatively low thermal signature with intelligent throttling, smart chassis design and an external power supply unit (PSU), and you’ve got a recipe for high productivity with low heat.
There is a trade-off, though: expandability. In most mini-PCs, the central processor is soldered to its motherboard. This setup is a common feature of laptops, mobile devices and smart consumer electronics. The approach makes the device more durable and resilient, and it’s also cheaper to manufacture, helping to keep retail prices low.
The GPU of your average mini PC is handled the same way. Either it’s integrated with the CPU or it’s offered as a discrete graphics processor that’s nevertheless inexorably attached to the motherboard.
Is that the end of the world? No, not by a long shot. A large percentage of mini-PCs are used for workloads that don’t require expansion. These jobs include videoconferencing, digital signage, restaurant self-order kiosks, even running the smart menu board at your local McDonald’s.
Bottom line: The lack of expansion possibilities is not usually a problem — unless, that is, you’re a gamer.
Enter Intel’s NUC Extreme Mini PC Kit. It’s housed in a small 8L chassis that measures just 4.7 x 7.8 x 14 inches. While that’s demonstrably less mini than its tiny siblings, it’s still about one fourth the size of a traditional 30L gaming tower PC.
Intel NUC Extreme: for gamers & creators who rely on expansion
Intel’s kit delivers the bare essentials: a chassis, motherboard, CPU, power-supply unit, Wi-Fi and an I/O set including Thunderbolt, HDMI and USB ports. The rest is up to you.
The rest can include up to a trio of M.2 SSDs, up to 64GB of DDR4 RAM, and a Gen 5 PCI Express discrete gaming graphics card (as long as it’s no more than 12 inches long).
What’s more, the Intel 10nm Core i7 or i9 processor included with the kit can be replaced with any Intel CPU, as long as it’s compatible with the LGA1700 socket and 600-series chipset.
Built to order
A fact mostly unknown to consumers yet nevertheless beloved by tech providers is that Intel has designed its NUC devices as a flexible platform for resellers.
NUC, which stands for Next Unit of Computing, exists as both an out-of-the-box solution and a have-it-your-own-way platform, replete with myriad components and custom mods.
This makes it possible for resellers like Simply NUC and France’s Bleu Jour to concoct solutions that make the most sense for their customers. They can also create those solutions while remaining mostly unhindered by supply-chain issues and mounting R&D costs.
Could Intel’s platform work for you and your customers, too? The answer may be (sorry!) NUC-ing at your door.
Is the modern eReader a technological anachronism — or the best-kept secret of diehard bibliophiles? It depends on who you ask.
Fans of cutting-edge tech will likely have little use for a dedicated eReader. Who needs a single-purpose, monochrome device?
After all, they’ll crow, my iPad handles eBooks just fine! And I can check my Instagram between chapters without changing devices.
But our diehard bibliophile will sing a different song. When asked to extol the virtues of eReaders, they’ll point first and foremost to the e-ink display.
E-ink, also known as electronic paper, enables your average Kindle to provide a decidedly book-like reading experience. Like a printed book — and very much unlike a standard tablet — it’s easy to read in bright sunlight.
Kindle fans also enjoy the lack of features in a dedicated eReader. They’ll tell you it eliminates distractions and removes the temptation to check your email one last time.
All about the screen
At its core, an eReader is a tablet. It’s got a touch interface, Wi-Fi connectivity, a processor and some memory.
But unlike your average tablet, eReaders present content via e-ink. This unique screen technology requires very little power, mainly because it doesn’t produce any light by itself. The overall impression is that you’re reading from a printed page.
Amazon’s latest Kindle Oasis, for example, uses what’s known as electrophoretic e-ink to produce high-resolution monochrome text and images.
The Kindle’s screen is made up of 2 conductive plates separated by a gap of 10 to 100 micrometers (1,000 micrometers equals about 0.04 inch). Between those plates sits a layer of dark-colored oil. And floating in that oil is a mass of titanium dioxide particles, each just 1 micrometer in diameter.
eReader under glass: The pixels have it
To produce text and images, the Kindle applies positive and negative charges across the plates. A positive charge brings negatively charged particles to the surface — and vice versa.
Particles drawn to the top plate make the pixel appear white. When the particles are pulled toward the bottom plate instead, the dark oil shows through and the pixel appears black. The shapes created by these so-called electrophoretic migrations appear on the screen as text and pictures.
By way of comparison, the image on your iPad is produced by a display that lights itself: either an LED backlight shining continuously through a liquid-crystal panel or, on newer models, millions of tiny self-emitting pixels.
That steady glow, with position and color defined by onboard graphics software and hardware, produces the kind of stunning, high-res graphics we take for granted. But the light never stops drawing power.
eReaders: still relevant?
Sure, e-ink may be yesterday’s news; the first electronic paper, Gyricon, was developed at Xerox PARC back in the 1970s. But it’s not without its virtues.
To start, e-ink doesn’t produce any light. As mentioned above, that makes it easy on the eyes and easy to see, even in bright sunlight.
By comparison, an LED screen constantly bombards you with blue light. Studies link this to mild issues like eye strain and fatigue, and researchers are investigating possible connections to more serious conditions such as macular degeneration.
Also, eReaders boast an incredibly long battery life. Amazon’s Kindle Oasis needs to be charged only about once every 6 weeks.
Amazon’s flagship e-reader, the Kindle Oasis
That’s thanks in part to its limited functionality. A Kindle doesn’t need the latest power-hungry M1 or Snapdragon processor. Nor does it require much RAM.
What’s more, an e-ink screen requires power only when it changes states. After a new page is loaded, the “ink” stays on the screen with no help from the device.
Features? Who needs features?
So is the eReader’s lack of features actually a virtue? A compelling argument could be made for its less-is-more approach. There’s something to be said for putting aside your devices — and the chaos that comes with them.
Getting lost in a great book is one of the splendors of life. It gives us a chance to forget our worries and become deliciously lost in a creative cosmos. The effect can reduce our heart rates, create new neural pathways and imbue us with invaluable empathy.
So, the next time you’re in the mood to plow through "War and Peace," cast a spell with Harry Potter, or invite a bodice-ripper to tickle your fancy, try reaching for a Kindle instead of an iPad.
In the digital pages of this one-trick pony, you might discover a brave new world.
Podcasts continue to be big business—and so does the tech used to create them.
Every day the movement grows as folks from all walks of life launch their first podcast episodes. And every day it seems a new piece of gear hits the market, promising easy recording, editing and publishing.
Maybe your customers are looking for a superior podcasting solution. Or maybe you’re ready to plant your flag on iTunes in hopes of drawing in some new clientele. Either way, it’s worth checking out some of the latest and greatest tech for audio and video podcasting.
Straight to the source
Spotify is now the second most popular destination for podcasts, boasting 3.6 million podcasts and 389 million monthly users across 187 markets.
At some point, someone on the Spotify team figured, hey, let’s cut out the middleman and just offer a way to produce and publish episodes right on the platform. Thus Spotify for Podcasters was born.
In case you didn’t know, Spotify already has a web-based audio editing suite called Soundtrap. The basic version is free. More sophisticated tools like pitch correction, loops and effects are available starting at around $8 a month.
Soundtrap: Spotify's free way
Using the Soundtrap interface, you can record, edit and publish right to Spotify. No surprise, right?
Spotify’s podcasting apparatus also includes analytics to help target your audience and discovery tools to help new podcasters get noticed.
The old-fashioned way
For those podcasters who prefer a more traditional approach to audio production, popular audio tech brand PreSonus offers a purpose-built box with some very interesting features.
The company’s Revelator io24, which retails for about $200, is a solid USB audio interface that includes a loopback mixer. This lets you add backing tracks while recording or patch in a Zoom caller for remote interviews.
PreSonus Revelator io24: loopback mixer included
Built-in digital signal processing (DSP) effects like reverb, delay and compression help give your casts that professional sound.
PreSonus also makes its own pro audio production software called Studio One. The free version is included with all its hardware products. So all you need to get started is a decent mic and a pair of headphones.
If there’s a downside to the PreSonus solution, it has to be the complexity. Make no mistake, this is pro audio gear, meaning there’s a learning curve. But once you get the hang of things, it can become a powerful tool for podcasting and streaming.
Give it a look
Pop quiz: After Google, what’s the most used search engine in the world? No, it’s not Bing or DuckDuckGo. It’s YouTube.
Podcasts are popular, there’s no doubt. But no matter how popular they get, they’ll never match the stunning popularity of YouTube videos.
Here’s another question: What’s the easiest way to take advantage of YouTube’s traffic to promote your podcast? The answer: Just pop on a webcam while you’re recording and give yourself a whole other product to upload.
Experienced podcasters know that investing in a high-quality camera such as the Logitech StreamCam will pay dividends in the end.
Logitech StreamCam: Keeps you centered
The StreamCam, which retails for around $130, shoots 1080p video at up to 60 frames per second (fps) through its premium glass lens. What’s more, this camera is smart enough to keep you centered in the frame even while you’re moving around.
Logitech also very much wants you to know about the StreamCam’s intelligent exposure system. In the company’s words, this “adjusts the aperture and ISO speed in real-time to ensure accurate skin tones for a more natural, healthy look—even in varying lighting conditions.”
Sales revenue + advertising on the cheap
Podcasting can help tech providers realize two kinds of upside.
First is the obvious one: Being a go-to source of podcasting gear is a solid business model.
The second and less obvious one involves putting your podcasting gear to work. Posting tech knowledge out into the Podcastiverse is a good way to pull in new business. If you can build a decent audience, you also open the possibility of gaining paying advertisers.
After all, who doesn’t love a good passive income stream?
A spate of new laptops designed for the post-pandemic hybrid workforce is hitting the market. Major players—including Acer, HP, Lenovo and Microsoft—are focusing intently on office workers who no longer want to work in the office, at least not every day.
The work from home (WFH) and hybrid work movements are growing fast. One big reason: People hate commuting. In a recent Gallup poll, more than half of office workers gave that as their top reason for working remotely.
The tech heavies have taken notice. Everyone, meet the new work/play/remote devices, Class of ’22.
Microsoft: Everything, everywhere, all the time
Microsoft’s new Surface Laptop Go 2 wants to hang out with you all the time. Call it technological co-dependency. This device has an overwhelming need to be your “Go 2” device, no matter where you are or what you’re up to.
This new Surface wants to collaborate with you on spreadsheets and PowerPoint decks. It wants you to pick up its optional Xbox controller and play games in the cloud. And it wants you to slide on a nice-looking pair of (also optional) Bluetooth headphones so you can snuggle together while watching a movie.
Microsoft Surface Laptop Go 2: Good to go at just 2.5 pounds
Surface Laptop Go 2 also wants you to slip its lithe 2.5-pound body into your bag. That way, you two can be together at home, in the office, on the beach, at the kid’s soccer game, or anywhere else life takes you. And its retail price starts at a pleasingly low $600.
HP: stands for ‘hustle power’
The word hustle appears in rather large letters at the very top of the press release for HP’s new cadre of work/play/remote-focused gear.
Looks like someone over at HP got the memo: During the pandemic, nearly 70% of creators started or expanded their freelance business.
In other words, the side-hustle is now way more hustle — and way less side.
To celebrate this dramatic realization, HP cordially invites you to whip out the ‘ol Mastercard and plunk down $900 to $1,600 on a new lappie designed to work, play and create in equal measure.
Priced at around $1,250, the new HP Spectre x360 will cost you almost twice as much as an entry-level Surface Laptop Go 2. Then again, the extra cheddar delivers a set of specs that run circles around Microsoft’s.
HP Spectre x360: Proprietary software included
You need look only as far as the Intel processor to measure HP’s commitment to performance. Even the default chip is a whole generation beyond what Surface has to offer.
While Microsoft keeps prices low with last year’s silicon, HP is playing to win the performance game. It’s offering 12th gen Intel Core i5 and Core i7 processors running as fast as 4.7GHz.
If speeds and feeds are what you’re into, then shopping for one of these HPs won’t disappoint. Then again, we’ve been there, done that with high-res screens, high-speed SSDs and a whole bunch of RAM.
So what really tells the tale of HP’s mission here? It’s the proprietary software that comes loaded on every machine.
Sure, Spectre is still your run-of-the-mill Windows box. But HP took a play out of the Apple book by including thoughtful, tightly integrated software to sweeten the deal.
Things like Power Saver Mode, In-Bag Detection and Adaptive Battery Optimizer take the guesswork out of managing your power supply when there’s nowhere to plug in.
There’s also HP Palette, a proprietary digital creative workspace with built-in cross-device collaboration tools.
Plus, there’s HP Auto Frame and HP Dynamic Voice Leveling, which polish the video and audio captured by the 5 MP camera. They’re designed to help you look and sound better on your next Zoom, Teams or Meet video call.
HP wants to help you look your best by improving the lighting, the quality of your voice and — wait for it — your skin, teeth and eyes.
Yes, folks, HP wants to make you pretty. The company says its Appearance Filter is for “the 60% of us who are more self-conscious on camera than in real life.” Indeed.
Everyone else: Acer, Dell et al.
Are your customers partial to other PC brands? Most have you covered for remote work, too.
Acer’s offering is its ConceptD 5 laptop line, which features the latest 12th gen Intel Core CPUs, all the way up to the i9.
Lenovo wants to sell you a ThinkPad C14 Hybrid Workforce Solution (cue the drastic eye-roll about that name). It’s a Chromebook laptop for business users.
And Dell has half a dozen new laptops, too.
Bottom line: Rather than wasting countless hours in rush-hour traffic, we’d all rather stay home and literally phone it in. Now the top PC makers have the laptops to support us.
So go ahead and pick out something nice, whether for yourself, your employees or your customers. Find a piece of gear that works and plays hard, yet travels light.
Choose wisely, and you might find yourself able to work from anywhere in the world. Or just at home, with your feet up on the couch. That’s nice too.
Have you ever seen a $500 million supercomputer? Few people have. But these computing behemoths—there are currently around 500 of them worldwide—work every day to make our lives better.
The newest kid on the block is called Aurora. As you read this, a team of specialized engineers is installing Aurora at the Argonne Leadership Computing Facility in Argonne, Ill.
Aurora: dawn of a new supercomputing era
Aurora is an exascale-class machine, meaning it will be able to process 2 quintillion floating-point operations per second (FLOPS). A quintillion is equal to 1 followed by 18 zeroes. In case you’re wondering, a typical desktop PC manages on the order of a teraflop, or a trillion operations per second. So yeah, this thing is pretty fast.
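To put those numbers in perspective, here’s some back-of-the-envelope math. The desktop figure below is a rough assumption for illustration, not a benchmark:

```python
# Back-of-the-envelope comparison of Aurora vs. a typical desktop PC.
# The desktop figure is an assumption for illustration only.

AURORA_FLOPS = 2e18    # 2 exaFLOPS = 2 quintillion operations per second
DESKTOP_FLOPS = 1e12   # assume roughly 1 teraFLOPS for a typical desktop

speedup = AURORA_FLOPS / DESKTOP_FLOPS
print(f"Aurora is roughly {speedup:,.0f}x faster")  # ~2,000,000x

# A job that keeps Aurora busy for one hour would keep
# that desktop busy for centuries:
desktop_years = (3600 * speedup) / (3600 * 24 * 365)
print(f"1 Aurora-hour ≈ {desktop_years:,.0f} desktop-years")  # ≈ 228
```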
Recipe for superpower
Would you like to make your very own exascale supercomputer? Cooking a great meal means starting with the right ingredients. Here’s what you’ll need:
> 20,000+ Intel Xeon Scalable CPUs
> 60,000+ Intel Xe GPUs
> 1 Intel DAOS storage system (230 petabyte capacity)
> 10 petabytes of Intel Optane Persistent Memory
> 10,000 compute blades
> A cooling system that could nearly freeze the sun
> $500 million (hint: check under sofa cushions)
But first, you’ll need to locate a few thousand square feet of space near a reliable power source. The Hoover Dam is a decent choice. Or a nuclear power plant if you’re into that sort of thing.
Next, mount your 10,000 blades in an HPE Cray EX supercomputer. Don’t forget to make sure that your Cray architecture includes Slingshot networking.
An individual compute node. Aurora will have 10,000 of them
Install your cooling system—make sure to plug it in, this thing gets hot!—and grab a parka.
Post armed guards around the perimeter, because the world’s most powerful computer can be used for evil as well as good. We wouldn’t want North Korea to get any funny ideas.
Click the on-button, then see what it’s like to use 2 exaflops of power to post a snarky comment on Twitter. Invite some scientists over to cure cancer.
The curing cancer thing isn’t just a joke. If scientists are going to put an end to the 21st century’s version of the Black Plague, they’re going to need a computer like Aurora, which combines AI with high-performance computing (HPC).
The same holds true for those working to deal with our self-imposed climate crisis, as well as for some less noble but still helpful tasks.
The latter includes complex fluid dynamics modeling to help create more efficient planes, trains and automobiles. Also, some good old science for the sake of science. Little things like exploring adjacent galaxies, creating the next generation of artificial intelligence, and finding the so-called God Particle.
Need another reason? Here’s a good one: Because we can. We don’t yet know the extent of what a 2 exaflop supercomputer can do for humanity. But we know we can build one, and we know we want to find out.
In other words, the journey is the destination. Designing and building Aurora is part of the evolution of humanity. We want to live longer, fly faster and solve mysteries that have confounded us for centuries. Aurora should help.
There will always be the next supercomputer. Just as Alan Turing’s ground-breaking machine begat the IBM Model 5150, which begat that smartphone in your pocket, Aurora and its ilk are the next steps on our journey to the future of computing.
In the short term, that could simply mean the world’s first 8 exaflops computer. In the longer term, perhaps Aurora begets the (possibly terrifying) sentient machines we all know/fear are coming.
Will Aurora’s great-grandchildren want to enslave us? Or will we be smart enough to build them in a way that keeps the Matrix from becoming a reality, yet endows them with the power to help us all live better?
Those are the kind of questions that compel us to load 80,000 chips into 10,000 nodes and then flick the magic switch to see what happens.
What would you like to see happen?
Most small and midsize businesses need a reliable local server. But when the time comes to make a smart purchase, too many SMB owners opt instead to repurpose an older PC.
That’s a bad idea. To be sure, installing a server OS on a dusty old desktop can save money in the short term. But the resulting downtime, repairs and frustration can explode your customer’s total cost of ownership (TCO) in no time.
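To see how quickly that short-term “savings” can evaporate, consider a simple TCO sketch. Every dollar figure here is a hypothetical assumption, not a quote:

```python
# Illustrative TCO comparison: repurposed desktop vs. entry-level server.
# All figures are hypothetical assumptions for the sake of the math.

YEARS = 3
cost_per_downtime_hour = 200  # assumed lost productivity per hour down

# Option A: "free" old desktop, but more downtime and repairs
desktop_hw = 0
desktop_downtime_hours_per_year = 40  # assumed
desktop_repairs_per_year = 300        # assumed, in dollars

# Option B: entry-level server, minimal downtime
server_hw = 700
server_downtime_hours_per_year = 4    # assumed
server_repairs_per_year = 50          # assumed, in dollars

def tco(hw, downtime_hours, repairs):
    """Hardware cost plus yearly downtime and repair costs over YEARS."""
    return hw + YEARS * (downtime_hours * cost_per_downtime_hour + repairs)

print(tco(desktop_hw, desktop_downtime_hours_per_year, desktop_repairs_per_year))  # 24900
print(tco(server_hw, server_downtime_hours_per_year, server_repairs_per_year))     # 3250
```

Under these assumptions, the “free” desktop costs more than 7 times as much over three years.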
The smarter play? Tell them to instead install a dedicated low-end server, one that’s purpose-built for the task.
The right tool for the job
Not to put too fine a point on it, but PCs are designed for personal computing, and servers are designed to, well, serve. The devil is in the details that separate these two digital archetypes, and those details can make all the difference in the world.
PCs simply can’t offer important server-class components such as error-correcting memory, redundant hard-drive arrays and high-speed PCI Express lanes. These features are what set dedicated servers apart from their PC cousins.
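The redundancy idea behind both ECC memory and RAID arrays boils down to storing extra parity information. Here’s a toy sketch of the XOR parity trick; it’s a simplified illustration of the principle, not how any specific controller actually works:

```python
# Toy illustration of XOR parity, the idea behind RAID-5 (and, loosely, ECC).
# If one data block is lost, it can be rebuilt from the survivors plus parity.

def xor_blocks(a: bytes, b: bytes) -> bytes:
    """Byte-wise XOR of two equal-length blocks."""
    return bytes(x ^ y for x, y in zip(a, b))

# Three "drives" each hold one data block; a fourth holds the parity.
d1, d2, d3 = b"AAAA", b"BBBB", b"CCCC"
parity = xor_blocks(xor_blocks(d1, d2), d3)

# Simulate losing drive 2, then rebuild it from the survivors + parity.
rebuilt = xor_blocks(xor_blocks(d1, d3), parity)
print("drive 2 rebuilt:", rebuilt)  # drive 2 rebuilt: b'BBBB'
```

A repurposed PC with a single non-ECC DIMM and one drive has none of this safety net.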
Even repurposing a brand-new desktop or mini PC as a server could come back to haunt its owners. Using a box that lacks a true server architecture is just asking for efficiency-destroying bottlenecks.
And the worst-case scenario? The short list includes data loss, security breaches, even a total processor meltdown. Don’t go there.
Spring for the good stuff
If your customers are ready to pull the trigger on a small-business server with a compelling price/performance ratio, they have lots of options.
It probably won’t surprise you to learn that one of the most popular solutions is designed by Dell. The house that Michael built has long been known for its inexpensive business solutions.
Dell’s selection of SMB-focused servers is no exception. Starting at around $700, Dell’s latest PowerEdge servers offer both desktop and rack-mount form factors. The former lends itself to a single-server solution, while the latter offers the option to install a networked stack like the big boys do.
Coming in at entry-level status, a server like the T150 is a small tower housing an Intel Pentium G640T, 8GB of RAM and a 1TB hard drive.
Dell PowerEdge T150 tower
There’s no OS when the PowerEdge comes out of the box, so some assembly is required. Those in the know may want to opt for a stable Linux build. But the path of least resistance will always be a copy of Windows Server 2022.
Moving up Dell’s food chain delivers a substantial price bump. But when it comes to serving scads of media files and MySQL databases while simultaneously collecting and parsing edge data, you have to haul out the big guns.
Kitting out a Dell PowerEdge Rack Server with an Intel Xeon Silver 4309Y server-class processor and 16GB of memory will push the price up to a little over $4K. Going for the ultimate config, however, will quickly launch you into the 5-digit range.
For instance, configuring Dell’s PowerEdge R650 rack server with a 2.4GHz Intel Xeon Platinum 8360Y processor, 32GB of ECC memory, and a couple of 2TB SSDs will easily bring the price tag over $14K. And that’s before you pick out that boffo network controller card you’ve always wanted.
Lots more fish in the sea
Dell isn’t the only choice for your SMB customers — not by a long shot. All of Big Tech is competing for some of that SMB server cash.
However, despite the sheer number of entry-level servers on the market, there’s not a lot of daylight between them. Most offer the same roster of Intel processors, network cards and memory modules.
Fortunately, there are a few notable exceptions. One comes from HPE, which is pushing an interesting microserver. It’s not often you see a server this small, especially one with such impressive specs.
HPE’s ProLiant MicroServer Gen10 Plus starts around $785. Config options include a quad-core Intel Xeon processor, up to 32GB of ECC DDR4 memory, and a 4-port Ethernet controller. Yet this server measures an almost impossibly small 9.65 inches wide.
HPE ProLiant Microserver: under 10 inches wide
At the other end of the spectrum is Lenovo’s ThinkSystem ST650 V2 Tower Server, which retails for nearly $3,400. Unlike the diminutive HPE, this Lenovo box is built to grow with your customers.
The latest ThinkSystem has room for 2 processors, 2 network cards, 8 drives and 9 PCIe cards. Talk about expandability.
No SMB owner wants to install a new server every year. So the key to success is choosing a platform that’s as future-proof as possible.
That’s where you come in. Helping your customer pick the right box for their business could end up being the most valuable thing you can offer them.
Choose wisely, and you could have a loyal customer for life.
It’s been 50 years since a hand-written, back-of-the-napkin drawing led to the invention of Ethernet. And it’s been 40 years since the introduction of the first Ethernet standard, brought to you by Intel, Xerox and Digital Equipment Corp. (DEC) and supported by a cast of thousands of engineers.
Since then, the evolution of Ethernet technology has enabled the kind of communications infrastructure early sci-fi writers could only dream about.
Yet today, many of us take the lightning-fast responsiveness of internet searches and apps for granted. If we have to wait 5 seconds for a web page to load, we move on to other things, shaking our fists at the technology gods and cursing our virtual solitary confinement.
But high-speed communications are not a birthright. In fact, they’re a hard-won achievement. Along the way there have been many new inventions to enable Ethernet’s multiple speed gains.
You can view Ethernet’s history in 3 eras, says Gary Gumanow, Intel sales enablement manager for Ethernet, who’s witnessed all three. And now we could be entering a fourth. Let’s take a look.
Era 1 (1980 - 2005): Faster yet also smaller
As the 1980s dawned, engineers from Intel, Xerox and DEC came together to author and standardize an IEEE specification for a communications protocol. What they couldn’t know then was that Ethernet would empower the tech world for decades to come.
Their early work was based on ideals of interoperability, openness and the need for both higher speeds and less delay (aka low latency).
While fast data transfer was just one of Ethernet’s many stated goals, it’s mainly how the technology’s first era has been defined. Early engineers worked tirelessly to push transfer rates from the original experimental 2.94 megabits per second (Mbps) up to 10 Mbps and beyond.
They achieved that goal in 1982. That’s the year Intel shipped the world’s first high-volume 10 Mbps Ethernet controller.
Early Ethernet products from Intel
This herculean effort would kick off many years of R&D. Intel provided the world’s first dual-speed 10/100 Mbps network adapter in 1994. Then the first single-chip 10/100 Mbps Ethernet controller in 1997. And then the first single-chip 10/100/1000 Mbps controller in 2001.
Intel’s 82557 10/100 Ethernet Adapter
Why was a single-chip network controller so revolutionary? Mainly because it ushered in LAN on Motherboard (LOM). This made Ethernet the ubiquitous connection for desktops, servers and embedded devices. Ethernet for everyone!
Era 2 (2005 - 2015): Server virtualization
The path to Ethernet’s second era, virtualization, was forged in 1999 with the standardization of Gigabit Ethernet (aka IEEE 802.3ab). The first compatible products came to market just 2 years later, in 2001.
In a way, Gigabit Ethernet was a solution looking for a problem. That’s because servers running a single application could barely push a Gigabit of data down the wire. Gigabit Ethernet provided more bandwidth than data centers needed.
Ah, but then came virtualization. This meant multiple applications could be consolidated on a single physical server, each application running with its own virtual operating environment. As Intel’s Gumanow said back then, “Every virtual machine needs a Gigabit Ethernet connection.”
Each VM was given its own dedicated, physical 1 Gigabit Ethernet (GbE) connection. This finally enabled data-center architects to begin using more of their available server resources. Now they could consolidate physical servers, increasing efficiency and creating greater IT scale.
Several years later, we got another 10X improvement with 10Gb Ethernet. Once again, the industry had delivered more bandwidth than we knew what to do with.
But it was also perfect timing for network and server architects. 10GbE solved the problem of sprawling network-adapter ports, cables and switch ports in servers and data-center racks.
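The consolidation math behind that claim is simple. In a rough sketch (the 30% utilization figure is an assumption for illustration), a single 10GbE port can replace a rack’s worth of dedicated 1GbE links:

```python
# Rough consolidation math for VM networking.
# The utilization figure is an illustrative assumption, not a measurement.

GBE = 1                # one VM's dedicated link, in Gbit/s
TEN_GBE = 10           # a single 10GbE adapter, in Gbit/s
avg_utilization = 0.3  # assume each VM averages 30% of its 1 Gbit/s link

# Naive 1:1 mapping: one physical port per VM
vms_per_10gbe_naive = TEN_GBE // GBE
print(vms_per_10gbe_naive)  # 10

# With statistical multiplexing of average loads, one port goes further
vms_per_10gbe_shared = int(TEN_GBE / (GBE * avg_utilization))
print(vms_per_10gbe_shared)  # 33
```

Fewer ports, fewer cables, fewer switch slots, which is exactly the sprawl 10GbE cleaned up.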
But how to virtualize all those network adapters in the server? The solution came from Intel. Working through the PCI-SIG, the company introduced Single-Root Input/Output Virtualization (SR-IOV).
Most virtualization hypervisors could emulate a network adapter in software, but that emulation introduced latency. By contrast, SR-IOV splits a physical adapter into multiple “virtual functions” that run on the adapter itself rather than in software on the host processor cores. The result: further improvements in throughput and server efficiency.
Another technology that helped data-center managers was improved network virtualization. One example is Intel’s Virtual Machine Device Queue (VMDQ), which aligns transmit and receive queues in hardware with a specific VM’s network adapter.
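The core idea of VMDQ-style queue alignment can be sketched in a few lines: incoming frames are steered into a per-VM receive queue keyed by destination MAC address, so the hypervisor’s software switch doesn’t have to sort them. This is a toy model, not Intel’s implementation; the MAC addresses and queue numbers are made up.

```python
# Toy model of VMDQ-style packet steering: the "adapter" places each
# incoming frame in a receive queue dedicated to the target VM,
# keyed by destination MAC. Hypothetical MACs and queue IDs throughout.

from collections import defaultdict

# Each VM's virtual NIC has a MAC; each MAC maps to one hardware queue.
mac_to_queue = {"aa:00:00:00:00:01": 0,
                "aa:00:00:00:00:02": 1,
                "aa:00:00:00:00:03": 2}

queues = defaultdict(list)

def receive(frame: dict) -> None:
    """Steer a frame into its VM's dedicated queue (unknown MACs go to -1)."""
    q = mac_to_queue.get(frame["dst_mac"], -1)
    queues[q].append(frame)

for dst in ["aa:00:00:00:00:02", "aa:00:00:00:00:01", "aa:00:00:00:00:02"]:
    receive({"dst_mac": dst, "payload": b"..."})

print({q: len(frames) for q, frames in queues.items()})  # {1: 2, 0: 1}
```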
These advances in both hardware and software have enabled virtualization to be the scale engine of the cloud. Today, VMs can host many customer workloads scaled across a single server platform in the cloud.
Networking still needed a method for keeping data and flows secure and separate from one another. From Ethernet’s point of view, though, the cloud is simply a vast array of remote data centers that collect, process and disseminate data for a wide variety of applications and users. What’s remarkable is the quantity of that data: nearly 100 zettabytes so far.
Data-center virtualization, enabled by Ethernet technology, also gave us access to fast, reliable email and vast databases. It still powers many of the conveniences we rely on today.
Today’s data centers get more done with virtualization
Another benefit of virtualization: It can reduce a data center’s carbon footprint. By enabling multiple VMs on a single physical server, virtualization has helped IT managers use electric power more efficiently.
The resulting decrease in power consumption for processing, cooling and management may not be the silver bullet we need to reverse climate change. But at least it’s a step in the right direction.
Era 3 (2015 - Present): Optimization
Improving how a technology works may not be as sexy as inventing it in the first place. But it’s no less valuable.
Today, Intel and others are considering how best to upgrade Ethernet by improving application responsiveness and reducing latency.
Bandwidth continues to improve as well: first 40GbE, then 25GbE and 100GbE, with 200GbE and beyond coming soon.
“A chain is only as strong as its weakest link,” Gumanow reminds us. “As apps scale, their responsiveness becomes less predictable. Application Device Queues provides the mechanisms to scale the pipeline between the network, compute and storage.”
Optimization is the process of reclaiming that predictability. It’s a big challenge given the titanic scale on which Ethernet-based systems now operate.
Yet the resulting systems could catalyze advances in mission-critical applications, such as those used in healthcare and aerospace, where failure simply isn’t an option.
Era 4: Will there be one?
While the future is uncertain, it’s safe to say Ethernet tech will be with us for some time to come.
Sure, each day brings the possibility of technological advances that offer faster speeds and higher reliability. In fact, many have already come and gone. But it would still take many years to replace the infrastructure that enables Ethernet to push data to even the farthest corners of the planet.
Ethernet is the unseen thread in the fabric of our digital lifestyles. It’s as vital to us now as it’s ever been. Ethernet will still be vital when we wake up tomorrow morning and log on to our devices to check the weather, start a Zoom meeting and navigate to a new destination.
So will the Ethernet have a fourth era? That may be the wrong question. Maybe the right one is: How do we know we’re not already in it?