In one of my previous blog posts I converted binary files to/from base64. There I used a naive, straightforward approach:
[Convert]::ToBase64String($bytes)
[Convert]::FromBase64String($base64String)
The problem with this approach is that to convert a big binary file you have to read the whole file into memory, which consumes a lot of memory and a lot of time.
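For reference, the naive version looks something like this (the file names are made up):

$bytes = [System.IO.File]::ReadAllBytes('C:\data\big.bin')  # loads the entire file into memory
$base64 = [Convert]::ToBase64String($bytes)                 # allocates a string ~1.33x the file size
[System.IO.File]::WriteAllText('C:\data\big.b64', $base64)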
A better approach is to read the file in chunks and convert chunk by chunk. Due to the nature of base64 (every 3 input bytes encode to exactly 4 output characters), the chunk size must be a multiple of 3 when converting binary -> base64, and a multiple of 4 for the reverse conversion; otherwise '=' padding ends up in the middle of the output stream.
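To see why the alignment matters, here is a quick illustration (not from the original post; the byte values are arbitrary):

$bytes = [byte[]](1..8)

# Encoding everything at once - the reference output:
[Convert]::ToBase64String($bytes)
# AQIDBAUGBwg=

# Chunks of 4 bytes (not a multiple of 3) inject padding mid-stream;
# the concatenation is not valid base64 and will not round-trip:
[Convert]::ToBase64String($bytes, 0, 4) + [Convert]::ToBase64String($bytes, 4, 4)
# AQIDBA==BQYHCA==   <- FromBase64String rejects this

# Chunks sized as a multiple of 3 concatenate cleanly:
[Convert]::ToBase64String($bytes, 0, 6) + [Convert]::ToBase64String($bytes, 6, 2)
# AQIDBAUGBwg=       <- identical to the single-shot result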
function ConvertTo-Base64
{
    param
    (
        [string] $SourceFilePath,
        [string] $TargetFilePath
    )

    $SourceFilePath = Resolve-PathSafe $SourceFilePath
    $TargetFilePath = Resolve-PathSafe $TargetFilePath

    $bufferSize = 9000 # must be a multiple of 3
    $buffer = New-Object byte[] $bufferSize

    $reader = [System.IO.File]::OpenRead($SourceFilePath)
    $writer = [System.IO.File]::CreateText($TargetFilePath)

    do
    {
        # a final chunk shorter than $bufferSize signals end of file
        $bytesRead = $reader.Read($buffer, 0, $bufferSize)
        $writer.Write([Convert]::ToBase64String($buffer, 0, $bytesRead))
    } while ($bytesRead -eq $bufferSize)

    $reader.Dispose()
    $writer.Dispose()
}

function ConvertFrom-Base64
{
    param
    (
        [string] $SourceFilePath,
        [string] $TargetFilePath
    )

    $SourceFilePath = Resolve-PathSafe $SourceFilePath
    $TargetFilePath = Resolve-PathSafe $TargetFilePath

    $bufferSize = 9000 # must be a multiple of 4
    $buffer = New-Object char[] $bufferSize

    $reader = [System.IO.File]::OpenText($SourceFilePath)
    $writer = [System.IO.File]::OpenWrite($TargetFilePath)

    do
    {
        $charsRead = $reader.Read($buffer, 0, $bufferSize)
        $bytes = [Convert]::FromBase64CharArray($buffer, 0, $charsRead)
        $writer.Write($bytes, 0, $bytes.Length)
    } while ($charsRead -eq $bufferSize)

    $reader.Dispose()
    $writer.Dispose()
}

function Resolve-PathSafe
{
    param
    (
        [string] $Path
    )

    # unlike Resolve-Path, this also works for paths that do not exist yet
    $ExecutionContext.SessionState.Path.GetUnresolvedProviderPathFromPSPath($Path)
}
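A round-trip usage sketch, assuming the functions above are loaded (the file names are made up; relative paths are fine thanks to Resolve-PathSafe):

ConvertTo-Base64 -SourceFilePath .\movie.avi -TargetFilePath .\movie.b64
ConvertFrom-Base64 -SourceFilePath .\movie.b64 -TargetFilePath .\movie-copy.avi
# movie-copy.avi is byte-for-byte identical to movie.avi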
Note that we use the Resolve-PathSafe helper instead of Resolve-Path, because Resolve-Path only works with paths that already exist, and we need to handle a $TargetFilePath that does not exist yet.
With that approach, converting a 10 MB file completed within a second, whereas the naive approach took hours.
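If you want to time it on your own machine, Measure-Command does the job (test.bin is a placeholder):

Measure-Command { ConvertTo-Base64 -SourceFilePath .\test.bin -TargetFilePath .\test.b64 }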
UPD: The original version of this post used [System.IO.Path]::GetFullPath for path resolution. Later I realized it won't work properly: it resolves relative paths against the process working directory, which PowerShell does not keep in sync with the current location, so we need the Resolve-PathSafe approach instead. The code above was updated to reflect that.
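A quick sketch of the difference (assuming a C:\Temp directory exists; not from the original post):

Set-Location C:\Temp

Resolve-Path .\out.txt        # throws if out.txt does not exist yet

[System.IO.Path]::GetFullPath('out.txt')
# resolves against the process working directory, which does not
# follow Set-Location - so this can point somewhere else entirely

$ExecutionContext.SessionState.Path.GetUnresolvedProviderPathFromPSPath('out.txt')
# C:\Temp\out.txt - honors the current location, file need not exist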
I just want to thank you for posting this article. It was quite helpful! I love it when a script comes together…
Is it possible to have the base64 string in a variable and decode it using a buffer?
This is amazing. I tried converting a 150KB base64 image. This script takes 0.0045s while the built-in one takes 545 times longer at 2.44s.
Definitely a life-saver.
Amazing. Thank you so much. I've spent the whole afternoon struggling with this, and the problem was solved in 5 minutes after I read your post. Did you ever imagine someone reading it 7 years later?
I don't have any clue about PowerShell scripting. Do I need to give the whole file path, including the file name, for the target path and the source path?
Dude! Thank you. It ran so quickly that I didn't think it had actually worked.