I'm using Node's stream.pipeline to upload some data to S3. The basic idea is to pull files from a request and write them to S3. I have one pipeline that successfully pulls zip files from the request and writes them to S3. However, I want a second pipeline that performs the same request but unzips the archive and writes the uncompressed files to S3. The pipeline code looks like this:
pipeline(request.get(...), s3Stream(zipFileWritePath)),
pipeline(request.get(...), new unzipper.Parse(), etl.map(entry => entry.pipe(s3Stream(createWritePath(writePath, entry)))))
The s3Stream function looks like this:
function s3Stream(file) {
  const pass = new stream.PassThrough()
  s3Store.upload(file, pass)
  return pass
}
The first pipeline works well and is currently running in production. However, when I add the second pipeline, I get the following error:

Error [ERR_STREAM_PREMATURE_CLOSE]: Premature close
at Parse.onclose (internal/streams/end-of-stream.js:56:36)
at Parse.emit (events.js:187:15)
at Parse.EventEmitter.emit (domain.js:442:20)
at Parse.<anonymous> (/node_modules/unzipper/lib/parse.js:28:10)
at Parse.emit (events.js:187:15)
at Parse.EventEmitter.emit (domain.js:442:20)
at finishMaybe (_stream_writable.js:641:14)
at afterWrite (_stream_writable.js:481:3)
at onwrite (_stream_writable.js:471:7)
at /node_modules/unzipper/lib/PullStream.js:70:11
at afterWrite (_stream_writable.js:480:3)
at process._tickCallback (internal/process/next_tick.js:63:19)
Any ideas as to what is causing this, or how to fix it, would be greatly appreciated!