Compressing an InputStream with gzip


I want to compress an input stream in Java using gzip compression.

Say we have an uncompressed input stream (1 GB of data...). What I want as the result is a compressed input stream from the source:

public InputStream getCompressedStream(InputStream unCompressedStream) {

    // Not working: this decompresses the stream, I want the opposite.
    return new GZIPInputStream(unCompressedStream); 

}
12 Answers


DeflaterInputStream is not what you want because it lacks the gzip header/trailer and uses a slightly different compression.

If you change from an OutputStream (push) to an InputStream (pull), you have to do things differently.

What GZIPOutputStream does is:

  • write a static gzip header
  • write the deflated stream using a DeflaterOutputStream. While the stream is written, a CRC32 checksum is built from the uncompressed data and the number of bytes is counted
  • write a trailer containing the CRC32 checksum and the byte count

If you want to do the same with InputStreams, you need a stream that contains:

  • the header
  • the deflated content
  • the trailer

The best way to do this is to provide 3 different streams and combine them into one. Luckily there is SequenceInputStream, which does the combining for you.
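As a minimal sketch of that idea (with plain ByteArrayInputStreams standing in for the header, deflated content, and trailer), SequenceInputStream simply drains each part in turn:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.SequenceInputStream;

public class SequenceDemo {
    // Drains a stream into a String (for demonstration only).
    static String readAll(InputStream in) throws IOException {
        StringBuilder sb = new StringBuilder();
        int c;
        while ((c = in.read()) >= 0) sb.append((char) c);
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        InputStream header = new ByteArrayInputStream("HEADER|".getBytes());
        InputStream body = new ByteArrayInputStream("BODY|".getBytes());
        InputStream trailer = new ByteArrayInputStream("TRAILER".getBytes());
        // SequenceInputStream reads each stream to exhaustion, in order.
        InputStream combined = new SequenceInputStream(header,
                new SequenceInputStream(body, trailer));
        System.out.println(readAll(combined)); // prints "HEADER|BODY|TRAILER"
    }
}
```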

Here is my implementation along with a simple unit test:

import java.io.ByteArrayInputStream;
import java.io.FileInputStream;
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.SequenceInputStream;
import java.util.Enumeration;
import java.util.zip.CRC32;
import java.util.zip.Deflater;
import java.util.zip.DeflaterInputStream;
import java.util.zip.DeflaterOutputStream;

/**
 * @author mwyraz
 * Wraps an input stream and compresses its contents. Similar to DeflaterInputStream but adds the GZIP header and trailer.
 * See GZIPOutputStream for details.
 * LICENSE: Free to use. Contains some lines from GZIPOutputStream, so Oracle's license might apply as well!
 */
public class GzipCompressingInputStream extends SequenceInputStream
{
    public GzipCompressingInputStream(InputStream in) throws IOException
    {
        this(in,512);
    }
    public GzipCompressingInputStream(InputStream in, int bufferSize) throws IOException
    {
        super(new StatefullGzipStreamEnumerator(in,bufferSize));
    }

    static enum StreamState
    {
        HEADER,
        CONTENT,
        TRAILER
    }

    protected static class StatefullGzipStreamEnumerator implements Enumeration<InputStream>
    {

        protected final InputStream in;
        protected final int bufferSize;
        protected StreamState state;

        public StatefullGzipStreamEnumerator(InputStream in, int bufferSize)
        {
            this.in=in;
            this.bufferSize=bufferSize;
            state=StreamState.HEADER;
        }

        public boolean hasMoreElements()
        {
            return state!=null;
        }
        public InputStream nextElement()
        {
            switch (state)
            {
                case HEADER:
                    state=StreamState.CONTENT;
                    return createHeaderStream();
                case CONTENT:
                    state=StreamState.TRAILER;
                    return createContentStream();
                case TRAILER:
                    state=null;
                    return createTrailerStream();
            }
            return null;
        }

        static final int GZIP_MAGIC = 0x8b1f;
        static final byte[] GZIP_HEADER=new byte[] {
                (byte) GZIP_MAGIC,        // Magic number (short)
                (byte)(GZIP_MAGIC >> 8),  // Magic number (short)
                Deflater.DEFLATED,        // Compression method (CM)
                0,                        // Flags (FLG)
                0,                        // Modification time MTIME (int)
                0,                        // Modification time MTIME (int)
                0,                        // Modification time MTIME (int)
                0,                        // Modification time MTIME (int)
                0,                        // Extra flags (XFLG)
                0                         // Operating system (OS)
        };
        protected InputStream createHeaderStream()
        {
            return new ByteArrayInputStream(GZIP_HEADER);
        }
        protected InternalGzipCompressingInputStream contentStream;
        protected InputStream createContentStream()
        {
            contentStream=new InternalGzipCompressingInputStream(new CRC32InputStream(in), bufferSize);
            return contentStream;
        }
        protected InputStream createTrailerStream()
        {
            return new ByteArrayInputStream(contentStream.createTrailer());
        }
    }

    /**
     * Internal stream without header/trailer  
     */
    protected static class CRC32InputStream extends FilterInputStream
    {
        protected CRC32 crc = new CRC32();
        protected long byteCount;
        public CRC32InputStream(InputStream in)
        {
            super(in);
        }

        @Override
        public int read() throws IOException
        {
            int val=super.read();
            if (val>=0)
            {
                crc.update(val);
                byteCount++;
            }
            return val;
        }
        @Override
        public int read(byte[] b, int off, int len) throws IOException
        {
            len=super.read(b, off, len);
            if (len>=0)
            {
                crc.update(b,off,len);
                byteCount+=len;
            }
            return len;
        }
        public long getCrcValue()
        {
            return crc.getValue();
        }
        public long getByteCount()
        {
            return byteCount;
        }
    }

    /**
     * Internal stream without header/trailer  
     */
    protected static class InternalGzipCompressingInputStream extends DeflaterInputStream
    {
        protected final CRC32InputStream crcIn;
        public InternalGzipCompressingInputStream(CRC32InputStream in, int bufferSize)
        {
            super(in, new Deflater(Deflater.DEFAULT_COMPRESSION, true),bufferSize);
            crcIn=in;
        }
        @Override
        public void close() throws IOException
        {
            if (in != null)
            {
                try
                {
                    def.end();
                    in.close();
                }
                finally
                {
                    in = null;
                }
            }
        }

        protected final static int TRAILER_SIZE = 8;

        public byte[] createTrailer()
        {
            byte[] trailer= new byte[TRAILER_SIZE];
            writeTrailer(trailer, 0);
            return trailer;
        }

        /*
         * Writes GZIP member trailer to a byte array, starting at a given
         * offset.
         */
        private void writeTrailer(byte[] buf, int offset)
        {
            writeInt((int)crcIn.getCrcValue(), buf, offset); // CRC-32 of uncompr. data
            writeInt((int)crcIn.getByteCount(), buf, offset + 4); // Number of uncompr. bytes
        }

        /*
         * Writes integer in Intel byte order to a byte array, starting at a
         * given offset.
         */
        private void writeInt(int i, byte[] buf, int offset)
        {
            writeShort(i & 0xffff, buf, offset);
            writeShort((i >> 16) & 0xffff, buf, offset + 2);
        }

        /*
         * Writes short integer in Intel byte order to a byte array, starting
         * at a given offset
         */
        private void writeShort(int s, byte[] buf, int offset)
        {
            buf[offset] = (byte)(s & 0xff);
            buf[offset + 1] = (byte)((s >> 8) & 0xff);
        }
    }

}

import static org.junit.Assert.*;

import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Arrays;
import java.util.zip.CRC32;
import java.util.zip.GZIPInputStream;

import org.junit.Test;

public class TestGzipCompressingInputStream
{

    @Test
    public void test() throws Exception
    {
        testCompressor("test1 test2 test3");
        testCompressor("1MB binary data",createTestPattern(1024*1024));
        for (int i=0;i<4096;i++)
        {
            testCompressor(i+" bytes of binary data",createTestPattern(i));
        }
    }

    protected byte[] createTestPattern(int size)
    {
        byte[] data=new byte[size];
        byte pattern=0;
        for (int i=0;i<size;i++)
        {
            data[i]=pattern++;
        }
        return data;
    }

    protected void testCompressor(String data) throws IOException
    {
        testCompressor("String: "+data,data.getBytes());
    }
    protected void testCompressor(String dataInfo, byte[] data) throws IOException
    {
        InputStream uncompressedIn=new ByteArrayInputStream(data);
        InputStream compressedIn=new GzipCompressingInputStream(uncompressedIn);
        InputStream uncompressedOut=new GZIPInputStream(compressedIn);

        byte[] result=StreamHelper.readBinaryStream(uncompressedOut);

        assertTrue("Test failed for: "+dataInfo,Arrays.equals(data,result));

    }

}

Thanks a lot for your answer! I'm wondering when I need to decompress the input stream by converting it with the GZIPInputStream class. Currently I'm getting an input stream from Docker, but I want to use your class to immediately download the image into a compressed stream. Then I want to upload that compressed stream to AWS S3 for other microservices to consume. When downloading and processing the input stream from those other microservices, I should decompress with the GZIPInputStream class, right? Can I simply keep the input stream compressed while it is in transit?
You also forgot to include the class and function for the helper method you use: StreamHelper.readBinaryStream.
Why would we compress on input, when there is already GZIPOutputStream that can compress for us?
You can use this in place of the missing helper function, by the way: https://stackoverflow.com/questions/1264709/convert-inputstream-to-byte-array-in-java
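For completeness, a minimal stand-in for the missing helper might look like the following. The name StreamHelper.readBinaryStream is taken from the unit test above; this is just one plausible implementation, not the author's original:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class StreamHelper {
    // Drains an InputStream fully and returns its contents as a byte array.
    public static byte[] readBinaryStream(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) >= 0) {
            out.write(buf, 0, n);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] result = readBinaryStream(new ByteArrayInputStream("hello".getBytes()));
        System.out.println(new String(result)); // prints "hello"
    }
}
```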


I wrote a version that has none of the CRC/GZIP magic-cookie handling in it, because it delegates to GZIPOutputStream. It is also memory efficient, in that it only uses enough memory to buffer the compression (a 42 MB file used a 45 KB buffer). Performance is the same as compressing into memory.

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.zip.GZIPOutputStream;

/**
 * Compresses an InputStream in a memory-optimal, on-demand way only compressing enough to fill a buffer.
 * 
 * @author Ben La Monica
 */
public class GZIPCompressingInputStream extends InputStream {

    private InputStream in;
    private GZIPOutputStream gz;
    private OutputStream delegate;
    private byte[] buf = new byte[8192];
    private byte[] readBuf = new byte[8192];
    int read = 0;
    int write = 0;

    public GZIPCompressingInputStream(InputStream in) throws IOException {
        this.in = in;
        this.delegate = new OutputStream() {

            private void growBufferIfNeeded(int len) {
                if ((write + len) >= buf.length) {
                    // grow the array if we don't have enough space to fulfill the incoming data
                    byte[] newbuf = new byte[(buf.length + len) * 2];
                    System.arraycopy(buf, 0, newbuf, 0, buf.length);
                    buf = newbuf;
                }
            }

            @Override
            public void write(byte[] b, int off, int len) throws IOException {
                growBufferIfNeeded(len);
                System.arraycopy(b, off, buf, write, len);
                write += len;
            }

            @Override
            public void write(int b) throws IOException {
                growBufferIfNeeded(1);
                buf[write++] = (byte) b;
            }
        };
        this.gz = new GZIPOutputStream(delegate); 
    }

    @Override
    public int read(byte[] b, int off, int len) throws IOException {
        compressStream();
        int numBytes = Math.min(len, write-read);
        if (numBytes > 0) {
            System.arraycopy(buf, read, b, off, numBytes);
            read += numBytes;
        } else if (len > 0) {
            // if bytes were requested, but we have none, then we're at the end of the stream
            return -1;
        }
        return numBytes;
    }

    private void compressStream() throws IOException {
        // if the reader has caught up with the writer, then zero the positions out
        if (read == write) {
            read = 0;
            write = 0;
        }

        while (write == 0) {
            // feed the gzip stream data until it spits out a block
            int val = in.read(readBuf);
            if (val == -1) {
                // nothing left to do, we've hit the end of the stream. finalize and break out
                gz.close();
                break;
            } else if (val > 0) {
                gz.write(readBuf, 0, val);
            }
        }
    }

    @Override
    public int read() throws IOException {
        compressStream();
        if (write == 0) {
            // write should not be 0 if we were able to get data from compress stream, must mean we're at the end
            return -1;
        } else {
            // reading a single byte
            return buf[read++] & 0xFF;
        }
    }
}

Seems to work, thanks! I found it useful to forward the close method to the delegate stream: @Override public void close() throws IOException { in.close(); } - Dennie


It looks like I'm 3 years late, but maybe it is still useful for someone. My solution is similar to @Michael Wyraz's; the only difference is that mine is based on FilterInputStream.

import java.io.ByteArrayInputStream;
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.zip.CRC32;
import java.util.zip.Deflater;

public class GZipInputStreamDeflater extends FilterInputStream {

    private static enum Stage {
        HEADER,
        DATA,
        FINALIZATION,
        TRAILER,
        FINISH
    }

    private GZipInputStreamDeflater.Stage stage = Stage.HEADER;

    private final Deflater deflater = new Deflater( Deflater.DEFLATED, true );
    private final CRC32 crc = new CRC32();

    /* GZIP header magic number */
    private final static int GZIP_MAGIC = 0x8b1f;

    private ByteArrayInputStream trailer = null;
    private ByteArrayInputStream header = new ByteArrayInputStream( new byte[] {
        (byte) GZIP_MAGIC, // Magic number (short)
        (byte) ( GZIP_MAGIC >> 8 ), // Magic number (short)
        Deflater.DEFLATED, // Compression method (CM)
        0, // Flags (FLG)
        0, // Modification time MTIME (int)
        0, // Modification time MTIME (int)
        0, // Modification time MTIME (int)
        0, // Modification time MTIME (int)
        0, // Extra flags (XFLG)
        0, // Operating system (OS)
    } );

    public GZipInputStreamDeflater(InputStream in) {
        super( in );
        crc.reset();
    }

    @Override
    public int read( byte[] b, int off, int len ) throws IOException {
        int read = -1;

        switch( stage ) {
            case FINISH:
                return -1;
            case HEADER:
                read = header.read( b, off, len );
                if( header.available() == 0 ) {
                    stage = Stage.DATA;
                }
                return read;
            case DATA:
                byte[] b2 = new byte[len];
                read = super.read( b2, 0, len );
                if( read <= 0 ) {
                    stage = Stage.FINALIZATION;
                    deflater.finish();
                    return 0;
                }
                else {
                    deflater.setInput( b2, 0, read );
                    crc.update( b2, 0, read );
                    read = 0;
                    while( !deflater.needsInput() && len - read > 0 ) {
                        read += deflater.deflate( b, off + read, len - read, Deflater.NO_FLUSH );
                    }
                    return read;
                }
            case FINALIZATION:
                if( deflater.finished() ) {
                    stage = Stage.TRAILER;

                    int crcValue = (int) crc.getValue();
                    int totalIn = deflater.getTotalIn();

                    trailer = new ByteArrayInputStream( new byte[] {
                        (byte) ( crcValue >> 0 ),
                        (byte) ( crcValue >> 8 ),
                        (byte) ( crcValue >> 16 ),
                        (byte) ( crcValue >> 24 ),

                        (byte) ( totalIn >> 0 ),
                        (byte) ( totalIn >> 8 ),
                        (byte) ( totalIn >> 16 ),
                        (byte) ( totalIn >> 24 ),
                    } );

                    return 0;
                }
                else {
                    read = deflater.deflate( b, off, len, Deflater.FULL_FLUSH );
                    return read;
                }
            case TRAILER:
                read = trailer.read( b, off, len );
                if( trailer.available() == 0 ) {
                    stage = Stage.FINISH;
                }
                return read;
        }
        return -1;
    }

    @Override
    public void close( ) throws IOException {
        super.close();
        deflater.end();
        if( trailer != null ) {
            trailer.close();
        }
        header.close();
    }
}

Usage:

AmazonS3Client s3client = new AmazonS3Client( ... );
try ( InputStream in = new GZipInputStreamDeflater( new URL( "http://....../very-big-file.csv" ).openStream() ); ) {
    PutObjectRequest putRequest = new PutObjectRequest( "BUCKET-NAME", "/object/key", in, new ObjectMetadata() );
    s3client.putObject( putRequest );
}

Nicolai, have you considered publishing this as a library? - Josh Lemer
Thank you, @JoshLemer. I haven't considered publishing it as a library. Do you think it would be a useful one? It only contains a single class, which is very small for a library, isn't it? - Nicolai
Note that if you use PutObjectRequest without providing Content-Length metadata to S3, the S3 SDK will buffer the whole stream internally to discover the length, which can cause out-of-memory errors when uploading large files that exceed the memory limit. It is safer to start a multipart upload with InitiateMultipartUploadRequest, then use UploadPartRequest, and finally complete it with CompleteMultipartUploadRequest. - Ninetou
Exactly three more years later, I'm afraid there is a bug in here. In the DATA stage, the deflater may have more data to return than the read buffer can hold. The next read, however, overwrites this data on the first deflater.setInput. The fix is to make sure more input is only set once the deflater is exhausted. A Kotlin variant with this fix can be found in http4k, if anyone is interested. - James


A working example of stream compression can be found in the well-known open-source ESB Mule: GZIPCompressorInputStream.

It uses the DeflaterInputStream provided by the JRE for compression, prepends the gzip header, and appends the gzip trailer (a.k.a. footer).

Unfortunately, it is under the CPAL license, which does not seem to be very common. Also, there appear to be no unit tests.


Apache Commons Compress now has GzipCompressorInputStream. - user1585916
The Apache Commons implementation has the same name, but it does not compress - according to the JavaDoc: "Input stream that decompresses .gz files." - Robert
@Apache, that is a really terrible name for this class. - mjaggard


If you don't want to load the content into a big byte array and need a true streaming solution:

package x.y.z;

import org.apache.commons.io.IOUtils;

import java.io.*;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;
import java.util.zip.ZipOutputStream;

/**
 * Stream Compression Utility
 *
 * @author Thamme Gowda N
 */
public enum CompressionUtil {
    INSTANCE;

    public static final int NUM_THREADS = 5;
    private final ExecutorService pool;

    CompressionUtil(){
        this.pool = Executors.newFixedThreadPool(NUM_THREADS);
    }

    public static CompressionUtil getInstance(){
        return INSTANCE;
    }

    /**
     * Supported compression type names
     */
    public static enum CompressionType {
        GZIP,
        ZIP
    }

    /**
     * Wraps the given stream in a Compressor stream based on given type
     * @param sourceStream : Stream to be wrapped
     * @param type         : Compression type
     * @return source stream wrapped in a compressor stream
     * @throws IOException when some thing bad happens
     */
    public static OutputStream getCompressionWrapper(OutputStream sourceStream,
                                     CompressionType type) throws IOException {

        switch (type) {
            case GZIP:
                return new GZIPOutputStream(sourceStream);
            case ZIP:
                return new ZipOutputStream(sourceStream);
            default:
                throw new IllegalArgumentException("Possible values :"
                        + Arrays.toString(CompressionType.values()));
        }
    }

    /**
     * Gets Compressed Stream for given input Stream
     * @param sourceStream  : Input Stream to be compressed to
     * @param type: Compression types such as GZIP
     * @return  Compressed Stream
     * @throws IOException when some thing bad happens
     */
    public static InputStream getCompressedStream(final InputStream sourceStream,
                                    CompressionType type ) throws IOException {

        if(sourceStream == null) {
            throw new IllegalArgumentException("Source Stream cannot be NULL");
        }

        /**
         *  sourceStream --> zipperOutStream(->intermediateStream -)--> resultStream
         */
        final PipedInputStream resultStream = new PipedInputStream();
        final PipedOutputStream intermediateStream = new PipedOutputStream(resultStream);
        final OutputStream zipperOutStream = getCompressionWrapper(intermediateStream, type);

        Runnable copyTask = new Runnable() {

            @Override
            public void run() {
                try {
                    int c;
                    while((c = sourceStream.read()) >= 0) {
                        zipperOutStream.write(c);
                    }
                    zipperOutStream.flush();
                } catch (IOException e) {
                    IOUtils.closeQuietly(resultStream);  // close it on error case only
                    throw new RuntimeException(e);
                } finally {
                    // close source stream and intermediate streams
                    IOUtils.closeQuietly(sourceStream);
                    IOUtils.closeQuietly(zipperOutStream);
                    IOUtils.closeQuietly(intermediateStream);
                }
            }
        };
        getInstance().pool.submit(copyTask);
        return resultStream;
    }

    public static void main(String[] args) throws IOException {
        String input = "abcdefghij";
        InputStream sourceStream = new ByteArrayInputStream(input.getBytes());
        InputStream compressedStream =
                getCompressedStream(sourceStream, CompressionType.GZIP);

        GZIPInputStream decompressedStream = new GZIPInputStream(compressedStream);
        List<String> lines = IOUtils.readLines(decompressedStream);
        String output = lines.get(0);
        System.out.println("test passed ? " + input.equals(output));

    }
}


PipedOutputStream lets you write data to a GZIPOutputStream and expose that data through an InputStream. It has a fixed memory cost, unlike the other solutions, which buffer the whole stream of data into an array or a file. The only problem is that you cannot read and write from the same thread; you have to use a separate one.

private InputStream gzipInputStream(InputStream in) throws IOException {
    PipedInputStream zipped = new PipedInputStream();
    PipedOutputStream pipe = new PipedOutputStream(zipped);
    new Thread(
            () -> {
                try(OutputStream zipper = new GZIPOutputStream(pipe)){
                    IOUtils.copy(in, zipper);
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
    ).start();
    return zipped;
}
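A self-contained round-trip of the same piped approach, using InputStream.transferTo (Java 9+) in place of commons-io's IOUtils so it runs without any external dependencies:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class PipedGzipDemo {
    // Compresses `in` on a background thread; the returned stream yields gzip bytes.
    static InputStream gzip(InputStream in) throws IOException {
        PipedInputStream zipped = new PipedInputStream();
        PipedOutputStream pipe = new PipedOutputStream(zipped);
        new Thread(() -> {
            try (OutputStream zipper = new GZIPOutputStream(pipe)) {
                in.transferTo(zipper);
            } catch (IOException e) {
                e.printStackTrace();
            }
        }).start();
        return zipped;
    }

    // Compresses a string and decompresses it again to verify the round trip.
    public static String roundTrip(String s) throws IOException {
        InputStream compressed = gzip(new ByteArrayInputStream(s.getBytes()));
        try (InputStream plain = new GZIPInputStream(compressed)) {
            return new String(plain.readAllBytes());
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(roundTrip("hello piped gzip")); // prints "hello piped gzip"
    }
}
```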

A nice solution. The only thing that gives me a headache is how to get a possible IOException out of the thread so it is not silently ignored. - Robert
I think in that case I would use EasyStream's OutputStreamToInputStream, as @smartwjw suggests in his answer. It essentially does the same thing behind the scenes, but the docs say it exposes any exception via getResult(). - MikeFHay
As a comment, having tried it and found out it does not work: do not try to put the PipedInputStream and PipedOutputStream declarations into a try-with-resources declaration, e.g.: try (InputStream is = ... ; //whatever to open the input stream PipedInputStream zipped = new PipedInputStream(); PipedOutputStream pipe = new PipedOutputStream(zipped);) { new Thread(....) //as above. This is wrong! If you want to do this, open the input stream above the try-with-resources declaration and put the rest inside the try block. - SteveR

To compress data you need a GZIPOutputStream. But since you need to read the data back as if from an InputStream, you have to convert the OutputStream into an InputStream. You can do that by writing into a ByteArrayOutputStream and wrapping its byte array:
ByteArrayOutputStream bos = new ByteArrayOutputStream();
GZIPOutputStream gout = new GZIPOutputStream(bos);
//... Code to read from your original uncompressed data and write to gout.
gout.close();

//Convert to InputStream.
InputStream compressed = new ByteArrayInputStream(bos.toByteArray());

The limitation of this approach, however, is that you need to read in all the data first - which means you must have enough memory to hold that buffer.

An alternative using pipes is mentioned in this thread - How to convert an OutputStream to an InputStream?


@Fabien - saw your comment below - if you really have 1 TB of data to read from the input stream, do not use the approach above!! Unless you have 1 TB of memory to spare. Use the piped approach. - kjp
There is no "good" standard way to turn an OutputStream into an InputStream. There are only two ways to do it: either cache the whole OutputStream somewhere, or use a thread. Both have their drawbacks. Use an InputStream if you can. See my post below. - Michael Wyraz
@kjp - GZIPOutputStream has no default constructor. - Giovanni Giachetti
Please update the answer; I'm not sure whether it worked in 2012, but I know it does not work in 2016. - Dici


There is no DeflatingGZIPInputStream class in the JRE. To compress in the "deflate" compression format, use java.util.zip.DeflaterInputStream or java.util.zip.DeflaterOutputStream.

public InputStream getCompressedStream(InputStream unCompressedStream) {
    return new DeflaterInputStream(unCompressedStream); 
}

By studying the java.util.zip.GZIPOutputStream source code, you can derive a class from java.util.zip.DeflaterInputStream that deflates in GZIP format.
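To illustrate what DeflaterInputStream alone gives you: its output is zlib-wrapped deflate data that InflaterInputStream can read back, but it lacks the gzip header and trailer, so GZIPInputStream would reject it. A small round-trip sketch:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.zip.DeflaterInputStream;
import java.util.zip.InflaterInputStream;

public class DeflateDemo {
    // DeflaterInputStream compresses as you read; InflaterInputStream reverses it.
    // Both use the zlib wrapper by default, so they pair up without any gzip framing.
    public static String roundTrip(String s) throws IOException {
        InputStream deflated = new DeflaterInputStream(new ByteArrayInputStream(s.getBytes()));
        try (InputStream inflated = new InflaterInputStream(deflated)) {
            return new String(inflated.readAllBytes());
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(roundTrip("deflate round trip")); // prints "deflate round trip"
    }
}
```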


Shouldn't you be looking at GZIPOutputStream in this case?

public OutputStream getCompressedStream(InputStream input) {
    OutputStream output = new GZIPOutputStream(new ByteArrayOutputStream()); 
    IOUtils.copy(input, output);
    return output;
}

GZIPOutputStream returns an OutputStream, but I need an InputStream. - Fabien
You cannot create another input from an input, only an output. - adarshr

You can use EasyStream. (Note: the original snippet wrapped the source in a GZIPInputStream, which decompresses; to produce a compressed stream, wrap the data sink in a GZIPOutputStream instead:)
try (final InputStreamFromOutputStream<Void> isOs = new InputStreamFromOutputStream<Void>() {
    @Override
    protected void produce(final OutputStream dataSink) throws Exception {
        try (OutputStream gzipped = new GZIPOutputStream(dataSink)) {
            IOUtils.copy(unCompressedStream, gzipped);
        }
    }
}) {

    //You can use the compressed input stream here

} catch (final IOException e) {
    //Handle exceptions here
}
