Wendell Wilson is one of the founders of Level 1 Techs, which is, as the website describes it, “a group of nerds who are passionate about technology and how it shapes our world.” They create videos to share their knowledge about tech, science, and design. As explained on the website, “At one time or another, we were all Level1; we all had to start somewhere. Our aim for our channel and our community is to be that ‘somewhere’ for a new generation of nerds. We aim to bring together both technology enthusiasts and professionals to accomplish more using our collective minds, skills and resources.”
Here is our exclusive interview with Wendell about his journey to becoming a tech influencer. [This interview has been edited for clarity and length].
Please tell us about yourself and your experience. How did you get started? And how did Level1 Techs and your own consultancy evolve together?
I grew up in a place that was relatively rural and isolated; technology let me connect with people and knowledge in a way that kept me engaged, and so it became a fixture in my life. I love programming, working on computers and problem solving. I love getting to understand things. I love processes and automating things. So, it was only natural that, after getting a CS degree in college, I'd find my way to general technology consulting. It's never the same and there is never a day that is boring. It has not been easy, but being so absorbed by technology, and learning as much about everything as I possibly could since before I was a teenager, really helped me a lot, I think.
Level1Techs is still evolving; while historically it has mostly been a hobby and a fun thing to do online, it has also let me connect with people—entire industries, really—in a way that I would never have been able to with a traditional existence. I find the insight from being on the periphery of so many talented folks working all across the industry to be very enlightening.
Tell us about the joke behind the name Level1, and why the concept is core to the Level1 ethos.
Working with technology can be stressful; far too many people working in the industry have no outlet and take it far too seriously. Level1 sounds good to outsiders, but anyone working in a technical industry knows that the Level1 people are just the gatekeeper bozos you have to go through before you can get to the real people that know what they're doing. I think it's probably a good character trait if you live in near-constant fear of espousing what you know from atop Mt. Stupid (For the non-xkcd Six-Sigma types, that's the Dunning-Kruger effect zone).
There is a strong crossover between the gaming community and the data center crowd, and it's clear from your content that your own passions are what have drawn these kinds of followers. What's your favorite part about the well-documented relationship between these communities? Any fun anecdotes that stick out?
My favorite one for C-level executives is to have them find out who, among those in line for promotion to team lead or team management, games. If you can manage a guild or raiding party (generally) you can manage developers. Of course, if they're already managing an internal team, or two, then even better. I love it when people have hobbies that give them skills for their jobs; gaming very often does.
What kinds of content do your followers geek out the most over?
It varies a lot. Everyone seems to like "How does it work?" type content whether that is about open source, servers or even just how to research what makes a good product (for a particular type of product). Some of our content where we just wax poetic about industry experience also seems to be popular.
Which data center problems are most intriguing for you to solve? What is the most fun kind of thing for you to geek out about?
If I get some more time, I'd like to get back into robotics and AI. I did some really cool stuff as an undergrad, but I haven't had any time in my career for this type of work.
In the datacenter it's all about efficiency and getting the most compute (or the most flexibility) per dollar. Containerization solves a lot of problems, but it isn't (yet) super great for workloads where you want to optimize for specific parameters like cache utilization, latencies, storage throughput (or latency), GPU, compute, etc. It still depends on an analyst or sysadmin to know what kind of work is being done to help pick the right platform and architecture on which to host those containers. There is a tremendous opportunity in the market to blend edge compute with cloud compute in a way that gives you the flexibility and horsepower of the cloud, but the immediacy of arm's-reach resources.
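[Ed.'s Note: as a rough illustration of the judgment call Wendell describes—matching a workload's actual bottleneck to the right class of host—here is a hypothetical Python sketch. The node classes, names, and numbers are invented for illustration, not taken from any real deployment.]

```python
# Toy sketch: pick a host class for a containerized workload based on what the
# workload actually cares about (storage latency, cache, GPU), not just raw CPU.
# Node classes and figures below are invented for illustration only.

NODE_CLASSES = {
    "general":       {"gpu": False, "nvme_read_latency_us": 90, "l3_cache_mb": 32},
    "storage-heavy": {"gpu": False, "nvme_read_latency_us": 20, "l3_cache_mb": 32},
    "gpu-compute":   {"gpu": True,  "nvme_read_latency_us": 90, "l3_cache_mb": 64},
}

def pick_node_class(workload):
    """Return the first node class that satisfies a workload's stated needs."""
    for name, spec in NODE_CLASSES.items():
        if workload.get("needs_gpu") and not spec["gpu"]:
            continue
        if spec["nvme_read_latency_us"] > workload.get("max_storage_latency_us", 1000):
            continue
        if spec["l3_cache_mb"] < workload.get("min_l3_cache_mb", 0):
            continue
        return name
    raise LookupError("no node class fits this workload")

if __name__ == "__main__":
    db_scan = {"max_storage_latency_us": 50}               # latency-sensitive scan job
    training = {"needs_gpu": True, "min_l3_cache_mb": 48}  # cache-hungry GPU job
    print(pick_node_class(db_scan))    # -> storage-heavy
    print(pick_node_class(training))   # -> gpu-compute
```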
You've played with Liqid NVMe solutions in the past. What was your experience with that?
Liqid's work on NVMe solutions is really just a sort of "gateway drug" for thinking about what comes next.
What initially got me thinking this way was Amazon's EC2 instances with (relatively small) NVMe. How do they do that? Can you slice up an NVMe securely, and with little overhead, for multiple tenants? Can you handle a multi-user workload and prevent one user from monopolizing the whole thing? Liqid's already put in the work there to not only answer those questions but also fill the gaps major players leave out. For power users like me, with lots of virtual machines running lots of real work, I gotta go fast. If I'm scanning a 2-terabyte dataset in the background I want my apps in the foreground to remain zippy and responsive. Storage latency, more than throughput in most cases, is often the culprit when your machine feels sluggish.
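[Ed.'s Note: Wendell's latency-versus-throughput point can be sketched with a rough micro-benchmark like the hypothetical Python snippet below. The file path is an assumption, and because the OS page cache is not bypassed, real measurements would need direct I/O and far more care.]

```python
# Rough sketch: compare per-request latency of small random reads against the
# throughput of one large sequential read. The test file path is an assumption;
# results are only indicative because the OS page cache is not bypassed here.
import os
import random
import time

PATH = "testfile.bin"   # assumed pre-existing file, ideally a few hundred MiB
BLOCK = 4096             # 4 KiB random reads
RANDOM_READS = 1000

fd = os.open(PATH, os.O_RDONLY)
size = os.fstat(fd).st_size

# Small random reads: roughly what an interactive app's lookups feel like.
t0 = time.perf_counter()
for _ in range(RANDOM_READS):
    offset = random.randrange(0, size - BLOCK) // BLOCK * BLOCK
    os.pread(fd, BLOCK, offset)
elapsed = time.perf_counter() - t0
print(f"avg random 4 KiB read latency: {elapsed / RANDOM_READS * 1e6:.1f} us")

# One big sequential pass: roughly what a background dataset scan looks like.
t0 = time.perf_counter()
read_bytes = 0
while chunk := os.pread(fd, 8 * 1024 * 1024, read_bytes):
    read_bytes += len(chunk)
elapsed = time.perf_counter() - t0
print(f"sequential throughput: {read_bytes / elapsed / 1e6:.0f} MB/s")

os.close(fd)
```

On a fast NVMe drive it is usually the random-read latency number, not the throughput number, that tracks how "zippy" the machine feels.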
You’ve said that you’re really excited about composable infrastructure. Tell us what you think the future holds for composable?
[Ed.'s Note: Liqid can also compose for Ethernet, InfiniBand, and other commercially available fabrics.]
I am vicariously excited for all my overworked, stressed compadres in the data centers. Business units' goals change; mission creep happens; scope creep happens. For the folks that already have composable infrastructure, how cool is it that admins were able to pivot to a 100% work-from-home infrastructure immediately, overnight, simply by reconfiguring their composable infrastructure? It's finally possible to have physical infrastructure as flexible as containers, but with direct access to PCIe hardware like GPUs for compute. Autoscaling is easily possible without having to rely on the big cloud providers.
Composable is a game changer because it easily, and simply, enables the PCIe fabric to be *the* communications fabric. Connecting a bunch of servers in a rack via high-speed, low-latency PCIe is only the beginning—combine that with SR-IOV for NICs, storage, GPUs and more, plus technologies like NVIDIA's GRID, and you can literally have a generic compute rack that can, in a matter of minutes, go from full-fat GPU-accelerated AI research projects to "emergency" Virtual Desktop Infrastructure (VDI) host machines. No one in the data center has to lift a finger. You could even use the PCIe fabric for high-speed interprocess communication across machines in the same cluster!
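[Ed.'s Note: for readers who haven't worked with SR-IOV, the hypothetical Python sketch below pokes the standard Linux sysfs interface for splitting a physical NIC into virtual functions. The interface name is an assumption, and the write requires root on SR-IOV-capable hardware.]

```python
# Sketch: split an SR-IOV-capable NIC into virtual functions via Linux sysfs.
# "enp1s0f0" is an assumed interface name; run as root on SR-IOV hardware.
from pathlib import Path

IFACE = "enp1s0f0"
dev = Path(f"/sys/class/net/{IFACE}/device")

total_vfs = int((dev / "sriov_totalvfs").read_text())
print(f"{IFACE} supports up to {total_vfs} virtual functions")

# Writing a count to sriov_numvfs asks the driver to create that many VFs, each
# of which can then be handed to a VM or container as its own PCIe device.
# (If VFs already exist, most drivers require writing 0 first to reset.)
wanted = min(4, total_vfs)
(dev / "sriov_numvfs").write_text(str(wanted))
print(f"requested {wanted} VFs; check lspci for the new virtual functions")
```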
What’s on the horizon for L1?
I would really love to build out a lot more "recipes" for server and higher-end stuff. We've got some really great recipes for virtualization folks—cross-platform VFIO; the ultimate cross-platform developer workstation series; configuring your Linux workstation for max awesomeness; perf tuning ZFS; etc.—but I want to go farther: setting up 8-24 channel NVMe arrays, DIY servers. I've recruited some really neat folks in the community who work in the industry and we've written some content on things like Kubernetes together.
It's a promising start, I think. I'm working on fixing up some more physical space for L1 activities and for longer-running experiments. I have a couple of generations of Threadripper machines around the office now, and from before that, some fairly nice older 1-2P Xeon servers that were cast-offs from my clients. The amount of compute I have at my fingertips really drives home two points for me: 1) cloud compute is Actually Kind of Expensive, and 2) it is the most exciting time in history for servers and the datacenter space, because all these new innovations are driving down compute costs. (Which, eventually, will make its way to the cloud...)
Visit Level 1 Techs to see all their video reviews and articles, and to geek out with other tech-savvy folks in their forum.