Hey everyone! I was asked by a coworker to write a script that can retrieve the total sizes and space utilization of specific shared folders on a server share. I figured it would be a straightforward task, but I ended up struggling, especially when trying to get the size info for the folders, which seemed to take forever. My coworker ended up doing it manually and found that the combined size of two folders was around 2TB, so I know there's a lot of data involved. Is there an efficient way to compute this using PowerShell? Here's my code that didn't work as intended:
```powershell
$paths = @("\serversharefoldername1", "\serversharefoldername2")
$totalSize = 0
$freeSpace = 0
# One FileSystemObject is enough; reuse it for every path
$fso = New-Object -ComObject Scripting.FileSystemObject
foreach ($uncPath in $paths) {
    $folder = $fso.GetFolder($uncPath)
    # Drive.TotalSize / Drive.FreeSpace describe the disk behind the share
    $totalSize += $folder.Drive.TotalSize
    $freeSpace += $folder.Drive.FreeSpace
}
$totalTB = $totalSize / 1TB
$freeTB = $freeSpace / 1TB
$usedTB = ($totalSize - $freeSpace) / 1TB
$usedPct = (($totalSize - $freeSpace) / $totalSize) * 100
$freePct = ($freeSpace / $totalSize) * 100
Write-Host "Combined Totals" -ForegroundColor Cyan
Write-Host ("  Total Size:   {0:N2} TB" -f $totalTB)
Write-Host ("  Free Space:   {0:N2} TB" -f $freeTB)
Write-Host ("  Used Space:   {0:N2} TB" -f $usedTB)
Write-Host ("  Used Space %: {0:N2}%" -f $usedPct)
Write-Host ("  Free Space %: {0:N2}%" -f $freePct)
```
Any advice on how to make this run faster would be awesome!
6 Answers
When you say your coworker did it 'manually', do you mean he right-clicked the folder and waited for Properties to finish counting? PowerShell can do the same thing, but it's usually slower because it enumerates every file one at a time over the network. You could parallelize your folder checks to speed things up.
For accuracy, you’ll want to know that there's no 'free space per folder' on a shared network drive. Free space belongs to the disk the share lives on, not to the folders themselves, so query the disk once and calculate the folder sizes separately.
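Something like this gets the disk-level numbers in a single call (the UNC path is a placeholder for your share root):
```powershell
# Query the disk behind the share once; TotalSize/FreeSpace are disk-wide
$fso = New-Object -ComObject Scripting.FileSystemObject
$drive = $fso.GetFolder('\\server\share').Drive   # placeholder UNC path
'{0:N2} TB total, {1:N2} TB free' -f ($drive.TotalSize / 1TB), ($drive.FreeSpace / 1TB)
```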
Have you tried using `robocopy`? It's really fast for retrieving folder sizes, even over a network. You just send it to a null target and then process the results.
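Roughly like this; the UNC path is a placeholder, and I'm scraping the total out of robocopy's summary output, so double-check the parsing against your version/locale:
```powershell
# /L = list only (nothing is copied), /E = recurse into subfolders,
# /BYTES = report raw byte counts, /NFL /NDL /NJH = suppress per-file noise
$out = robocopy '\\server\share\foldername1' 'NULL' /L /E /BYTES /NFL /NDL /NJH /R:0 /W:0
# Pull the total out of the summary's "Bytes :" line
$m = $out | Select-String 'Bytes\s*:\s*(\d+)' | Select-Object -First 1
'{0:N2} TB' -f ([decimal]$m.Matches[0].Groups[1].Value / 1TB)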
If you just need the size of folders and subfolders, you can use the `Get-ChildItem` command with `Measure-Object`. A quick command would be something like this:
```powershell
(Get-ChildItem C:\temp -Recurse -File | Measure-Object -Property Length -Sum).Sum / 1GB
```
This should be much faster than your current approach!
Good tip! But for OP's case, `Scripting.FileSystemObject` might be quicker for folder sizes: its `Folder.Size` property does the recursive walk inside compiled COM code instead of piping every file object through PowerShell.
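Something along these lines (the UNC path is a placeholder):
```powershell
# Folder.Size totals the folder recursively inside the COM layer
$fso = New-Object -ComObject Scripting.FileSystemObject
'{0:N2} TB' -f ($fso.GetFolder('\\server\share\foldername1').Size / 1TB)
```
One caveat: `Folder.Size` errors out if any subfolder denies you access, so wrap it in try/catch on trees with messy permissions.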
If the server's a Windows system, consider using tools like `WizTree` directly on the server. You can run it with `Invoke-Command` to export results to CSV for easy processing.
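If you try that, a rough sketch; the install path and the `/export` switch are assumptions based on WizTree's command-line options, so verify against your version:
```powershell
# Run WizTree on the server itself and dump results to CSV
# (exe path and /export switch assumed; check your WizTree install/docs)
Invoke-Command -ComputerName 'server' -ScriptBlock {
    & 'C:\Program Files\WizTree\WizTree64.exe' 'D:\share' /export='C:\temp\sizes.csv'
}
```
WizTree is fast because it reads the NTFS MFT directly instead of walking the directory tree.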
Calculating folder sizes over a network can be really slow. A better approach is to run a remote command directly on the server where the files are stored. That way, all calculations happen locally, which should speed things up. You might want to check out how to create a PowerShell session or use `Invoke-Command` to run your script on the server directly.
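A minimal sketch, assuming PowerShell remoting is enabled and that the shares map to local paths like `D:\share` on the server (adjust to your layout):
```powershell
Invoke-Command -ComputerName 'server' -ScriptBlock {
    param($folders)
    foreach ($f in $folders) {
        # Sum file lengths locally on the server; no SMB round-trips
        $sum = (Get-ChildItem $f -Recurse -File -ErrorAction SilentlyContinue |
                Measure-Object -Property Length -Sum).Sum
        [pscustomobject]@{ Path = $f; SizeTB = [math]::Round([double]$sum / 1TB, 2) }
    }
} -ArgumentList (, @('D:\share\foldername1', 'D:\share\foldername2'))
```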
Totally agree! You're right about SMB slowing things down. Using a remote session will definitely be faster. You could also enumerate the local paths on the server itself instead of going through the UNC path.
Right! Running commands in parallel could definitely reduce wait time. Just be careful with memory management!
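For example, on PowerShell 7+ you can fan the folders out with `ForEach-Object -Parallel`; a sketch with placeholder paths and an arbitrary throttle limit:
```powershell
# PowerShell 7+ only; each folder is measured on its own runspace
$paths = '\\server\share\foldername1', '\\server\share\foldername2'
$paths | ForEach-Object -Parallel {
    $sum = (Get-ChildItem $_ -Recurse -File -ErrorAction SilentlyContinue |
            Measure-Object -Property Length -Sum).Sum
    [pscustomobject]@{ Path = $_; SizeGB = [math]::Round([double]$sum / 1GB, 2) }
} -ThrottleLimit 4
```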