I am trying to base64-encode a ~66 MB zip file to a string and write it to a file using PowerShell. I'm working with a limitation that ultimately I have to include the base64-encoded string directly in a PowerShell script, such that when the script is run at a different location, the zip file can be recreated from it. I'm not limited to using PowerShell to create the base64-encoded string; it's just what I'm most familiar with.
The code I'm currently using:
$file = 'C:\zipfile.zip'
# Reads the entire file into memory as one byte array
$filebytes = Get-Content $file -Encoding byte
# Builds the whole base64 string in memory as well (about a third larger than the file)
$fileBytesBase64 = [System.Convert]::ToBase64String($filebytes)
$fileBytesBase64 | Out-File 'C:\base64encodedString.txt'
Previously, the files I worked with were small enough that encoding was relatively fast. However, the file I'm encoding now makes the process eat up all my RAM and is ultimately untenably slow. I get the feeling that there's a better way to do this, and would appreciate any suggestions.
UPDATE 2023-08-17: big files, memory usage and speed
As @JohnRanger mentioned, there is a problem with the previous answer: it is limited to source files of ~1.5 GiB and is memory-hungry.
The solution is to use file streams and a CryptoStream(... ToBase64Transform ...).
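A minimal sketch of that stream-based approach, assuming the question's paths; the $converterStream and [System.IO.File]::Create(...) names match the references in the timing notes below:

$sourceFile = 'C:\zipfile.zip'
$targetFile = 'C:\base64encodedString.txt'

$sourceStream    = [System.IO.File]::OpenRead($sourceFile)
$targetStream    = [System.IO.File]::Create($targetFile)

# Wrapping the source stream in a CryptoStream with a ToBase64Transform means
# reads come back base64-encoded; only small buffers are ever held in memory.
$converterStream = [System.Security.Cryptography.CryptoStream]::new(
    $sourceStream,
    [System.Security.Cryptography.ToBase64Transform]::new(),
    [System.Security.Cryptography.CryptoStreamMode]::Read)

# Stream the encoded bytes straight into the target file.
$converterStream.CopyTo($targetStream)

$converterStream.Dispose()   # also disposes $sourceStream
$targetStream.Dispose()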
⚠️ This code runs over 30 times faster on PS7 than on PS5:

- PS7: it takes 3.1 seconds for a 5316 MiB file on my machine. For comparison, plain file copying with [IO.File]::Copy(...) takes 1.8 seconds.
- PS5: it takes 115 seconds on my machine. I played with the buffer sizes in [System.IO.File]::Create(...) and $converterStream.CopyTo(...) but did not get any meaningful performance difference: for the same 5316 MiB file, the worst result was 115 seconds with the default buffer sizes and the best was 112 seconds while varying them. Buffer sizes may matter more for slow targets (e.g. a slow disk or a network share). The work always saturates a single CPU core.
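For reference, the buffer sizes mentioned above can be passed explicitly through overloads of those two calls; the 1MB values here are only illustrative:

$targetStream = [System.IO.File]::Create($targetFile, 1MB)   # bufferSize overload
$converterStream.CopyTo($targetStream, 1MB)                  # bufferSize overload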
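On the receiving side, where the question wants to recreate the zip from the encoded string, the same pattern works in reverse with a FromBase64Transform (which skips whitespace by default). A minimal sketch, with hypothetical paths:

$encodedFile = 'C:\base64encodedString.txt'   # hypothetical input path
$restoredZip = 'C:\restored.zip'              # hypothetical output path

$sourceStream    = [System.IO.File]::OpenRead($encodedFile)
$targetStream    = [System.IO.File]::Create($restoredZip)
$converterStream = [System.Security.Cryptography.CryptoStream]::new(
    $sourceStream,
    [System.Security.Cryptography.FromBase64Transform]::new(),
    [System.Security.Cryptography.CryptoStreamMode]::Read)

# Reading from the CryptoStream decodes the base64 text back to raw bytes.
$converterStream.CopyTo($targetStream)

$converterStream.Dispose()   # also disposes $sourceStream
$targetStream.Dispose()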