The script should copy a file and compute its hash. My goal is to have it read the file only once, instead of three times (read_for_copy + read_for_hash + read_for_another_copy), to minimize network load. So I try to read a chunk of the file, compute the MD5 hash, and write the chunk out to several locations. File sizes range from 100 MB up to 2 TB or even more. There is no need to verify file identity at this point; I only need to compute the hash of the original file.
And I am stuck on computing the hash:
$ifile = "C:\Users\User\Desktop\inputfile"
$ofile = "C:\Users\User\Desktop\outputfile_1"
$ofile2 = "C:\Users\User\Desktop\outputfile_2"
$md5 = new-object -TypeName System.Security.Cryptography.MD5CryptoServiceProvider
$bufferSize = 10mb
$stream = [System.IO.File]::OpenRead($ifile)
$makenew = [System.IO.File]::OpenWrite($ofile)
$makenew2 = [System.IO.File]::OpenWrite($ofile2)
$buffer = new-object Byte[] $bufferSize
while ( $stream.Position -lt $stream.Length ) {
    $bytesRead = $stream.Read($buffer, 0, $bufferSize)
    $makenew.Write($buffer, 0, $bytesRead)
    $makenew2.Write($buffer, 0, $bytesRead)
    # I am stuck here
    $hash = [System.BitConverter]::ToString($md5.ComputeHash($buffer)) -replace "-",""
}
$stream.Close()
$makenew.Close()
$makenew2.Close()
How do I accumulate the chunks so that I can compute the hash of the whole file?
And a bonus question: is it possible to compute the hash and write the data out in parallel, especially given that workflow { parallel { } } is not supported in PowerShell 6?
Many thanks.
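The key is that MD5CryptoServiceProvider implements ICryptoTransform: feed each chunk into the running hash state with TransformBlock, finish with a single TransformFinalBlock call, and then read the digest from the Hash property. A minimal sketch of just the hashing side (no copying), reusing the $ifile variable from above:

```powershell
$md5 = [System.Security.Cryptography.MD5CryptoServiceProvider]::new()
$stream = [System.IO.File]::OpenRead($ifile)
$buffer = [byte[]]::new(1mb)
while ( ($bytesRead = $stream.Read($buffer, 0, $buffer.Length)) -gt 0 ) {
    # Accumulate this chunk into the running hash state;
    # the int return value is the byte count processed, not the hash
    $null = $md5.TransformBlock($buffer, 0, $bytesRead, $null, 0)
}
# Finalize with an empty block; the digest is then available in $md5.Hash
$null = $md5.TransformFinalBlock([byte[]]::new(0), 0, 0)
$hash = [BitConverter]::ToString($md5.Hash).Replace('-','')
$stream.Close()
```

Because only $bytesRead bytes of the buffer are valid on the last iteration, passing the count to TransformBlock (rather than hashing the whole buffer as ComputeHash does) is what makes the chunked digest match the whole-file digest.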
Final listing:
$ifile = "C:\Users\User\Desktop\inputfile"
$ofile = "C:\Users\User\Desktop\outputfile_1"
$ofile2 = "C:\Users\User\Desktop\outputfile_2"
$md5 = new-object -TypeName System.Security.Cryptography.MD5CryptoServiceProvider
$bufferSize = 1mb
$stream = [System.IO.File]::OpenRead($ifile)
$makenew = [System.IO.File]::OpenWrite($ofile)
$makenew2 = [System.IO.File]::OpenWrite($ofile2)
$buffer = new-object Byte[] $bufferSize
while ( $stream.Position -lt $stream.Length )
{
    $bytesRead = $stream.Read($buffer, 0, $bufferSize)
    $makenew.Write($buffer, 0, $bytesRead)
    $makenew2.Write($buffer, 0, $bytesRead)
    # TransformBlock returns the number of bytes processed, not the hash, so discard it
    $null = $md5.TransformBlock($buffer, 0, $bytesRead, $null, 0)
}
# Finalize with an empty block; the digest is then available in $md5.Hash
$null = $md5.TransformFinalBlock([byte[]]::new(0), 0, 0)
$hash = [BitConverter]::ToString($md5.Hash).Replace('-','')
$hash
$stream.Close()
$makenew.Flush()
$makenew.Close()
$makenew2.Flush()
$makenew2.Close()
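As for the bonus question: workflow { parallel { } } is indeed gone in PowerShell 6, but the hashing can be overlapped with the two writes at the .NET level using Stream.WriteAsync, without any PowerShell jobs. A sketch of the modified loop, assuming the same $stream, $makenew, $makenew2, $md5 and $buffer as in the listing above:

```powershell
while ( ($bytesRead = $stream.Read($buffer, 0, $buffer.Length)) -gt 0 ) {
    # Kick off both writes asynchronously...
    $task1 = $makenew.WriteAsync($buffer, 0, $bytesRead)
    $task2 = $makenew2.WriteAsync($buffer, 0, $bytesRead)
    # ...and hash the same chunk on this thread while they run
    $null = $md5.TransformBlock($buffer, 0, $bytesRead, $null, 0)
    # Wait for both writes before reusing $buffer for the next read
    [System.Threading.Tasks.Task]::WaitAll($task1, $task2)
}
```

A full read-ahead pipeline (double buffering with two alternating buffers) would additionally overlap the read, at the cost of more bookkeeping; the sketch above already removes the serialization between hashing and writing.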