Efficiently Deploying Windows 11 to College Lab PCs Without a Windows Server

Asked By TechieTurtle92

Hey everyone, I'm looking for advice on deploying Windows 11 across about 50 lab PCs in a college setting. I'm a co-op student in the ICS department, and we're trying to streamline the Windows deployment process for our student workstations. We've traditionally used Clonezilla to deploy a master image, but we've run into issues that make the process more manual than we'd like.

We want a solution that allows us to maintain a single golden image that includes the latest updates and software, and we need to deploy it quickly to all lab machines at the start of each academic year. Our lab environment consists of 48 workstations spread across four rooms, with varying hardware configurations.

Currently, we use a base image with essential applications like Wireshark, VMware Workstation, and Microsoft Office 365, and we want to keep this setup while minimizing manual configuration. We've experimented with Sysprep and FOG, but ran into issues such as user profile settings being reset after deployment.
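To show what I mean by the profile issue: as far as we understand, Sysprep only carries a customized profile over to the default profile when CopyProfile is set in the answer file, so that's the kind of thing we've been checking. A minimal unattend.xml sketch of that fragment (the component attributes are the schema's standard values; the rest of our answer file is omitted):

```xml
<!-- Sketch only: specialize pass of unattend.xml.
     CopyProfile copies the customized built-in Administrator profile
     over the Default profile during the specialize pass. -->
<settings pass="specialize">
  <component name="Microsoft-Windows-Shell-Setup"
             processorArchitecture="amd64"
             publicKeyToken="31bf3856ad364e35"
             language="neutral"
             versionScope="nonSxS">
    <CopyProfile>true</CopyProfile>
  </component>
</settings>
```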

Given our constraints—a limited budget and no Windows Server—is continuing with FOG and Sysprep the best option, or is there a more efficient workflow for our scenario? Any suggestions or experiences you can share would be greatly appreciated!

4 Answers

Answered By QuickFixIt

Have you considered Serva for your deployment? It's an all-in-one network-boot server (proxyDHCP/TFTP/HTTP) that runs as a portable app on Windows. Drop it on a USB drive, point it at your boot files, and it can get your lab machines PXE-booting and pulling the base image without any dedicated infrastructure.

ImageInnovator88 -

Serva sounds interesting! I’ll have to look into that more—anything that speeds up deployment and simplifies my process is definitely worth exploring.

Answered By CampusTechie77

You might want to script the installation of your essential applications. I wrote a script that installs multiple programs and applies our configuration, and it was a real time-saver. I just run it from a USB drive or over the network and walk away while it installs everything. It could help you avoid some of those manual post-deployment tasks you're running into!
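A minimal sketch of that idea using winget (the package IDs below are guesses, so verify them with `winget search` first; VMware Workstation and Office licensing in a lab may need extra arguments):

```powershell
# Sketch: silent install of the lab's core apps via winget.
# Run in an elevated PowerShell session after the image is deployed.
# Package IDs are assumptions -- verify with: winget search <name>
$packages = @(
    'WiresharkFoundation.Wireshark',
    'VMware.WorkstationPro',
    'Microsoft.Office'
)

foreach ($id in $packages) {
    Write-Host "Installing $id ..."
    winget install --id $id --exact --silent `
        --accept-package-agreements --accept-source-agreements
    if ($LASTEXITCODE -ne 0) {
        Write-Warning "$id exited with code $LASTEXITCODE"
    }
}
```

Kept on a network share, a script like this also keeps the app list out of the golden image, so the image itself needs rebuilding less often.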

ScriptySam -

That’s a great point! Automating the installs can definitely save you a lot of time and hassle. I find it also reduces human error when setting everything up.

Answered By GizmoGuru44

If I were in your shoes, I'd consider OSDCloud, with the boot files served from a Debian machine. You can host the TFTP and HTTP side on something inexpensive, like a Raspberry Pi, then point your DHCP options at it for network booting. That way you can re-image a PC just by rebooting it and letting the network handle the rest. Once the image is set up, updating it is as simple as swapping out the ISO file. It also minimizes golden-image headaches, since you can keep your configurations separate and managed with PowerShell scripts!
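For the OSDCloud side, a rough sketch of building the boot media with the OSD PowerShell module (this part runs on a Windows machine with the Windows ADK and WinPE add-on installed; cmdlet names are from the OSD module, so check them against your module version):

```powershell
# Sketch: build an OSDCloud WinPE image to serve from the Pi.
# Requires the Windows ADK + WinPE add-on on this Windows build machine.
Install-Module -Name OSD -Force

New-OSDCloudTemplate                           # build the WinPE template from the ADK
New-OSDCloudWorkspace -WorkspacePath C:\OSDCloud
New-OSDCloudISO -WorkspacePath C:\OSDCloud     # emits an ISO to copy to the boot server
```

Re-imaging then boils down to: PXE-boot into that WinPE, and inside it run Start-OSDCloud (or your own wrapper script) to pull down the current Windows image.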

NerdyNetworker29 -

That sounds like a solid plan! I remember doing something similar with PXE booting, which made it super easy to deploy images across multiple machines quickly. Definitely let your networking team handle those DHCP settings.
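If the campus DHCP server is off-limits, one workaround is proxy-DHCP on the Pi itself, so nothing on the main server changes. A minimal dnsmasq sketch (the subnet and boot file are placeholders; chaining from the boot file into WinPE, e.g. via wimboot, is a separate step):

```
# /etc/dnsmasq.conf -- sketch of proxy-DHCP PXE on the Pi (Debian).
port=0                          # disable DNS; we only want TFTP/PXE here
dhcp-range=10.0.0.0,proxy       # proxy mode: advertise boot info, issue no leases
enable-tftp
tftp-root=/srv/tftp
# dnsmasq appends ".0" to the basename, so this serves /srv/tftp/pxelinux.0
pxe-service=x86PC,"Boot lab image",pxelinux
```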

Answered By SystemSage

Another approach would be to see whether Ghost is still viable in today's environment. Its GhostCast server can multicast an image to many machines at once, which is exactly your start-of-year scenario. It might bring back some old-school imaging nostalgia, but it can still get the job done!

RetroTechie -

Ghost was a game-changer back in the day; it can still be quite useful for batch imaging. Definitely a suggestion worth considering!
