
Conference rusure::math

Title:Mathematics at DEC
Moderator:RUSURE::EDP
Created:Mon Feb 03 1986
Last Modified:Fri Jun 06 1997
Last Successful Update:Fri Jun 06 1997
Number of topics:2083
Total number of notes:14613

924.0. "Data Reduction For Binary Images With Limted Stack" by WONDER::NORTON (Charles McKinley Norton | MSB) Fri Sep 02 1988 01:33

    Is the Lempel-Ziv data compression algorithm noted in 336 suitable for
    compressing binary images?  I ask because 334.last mentions that
    Lempel-Ziv was derived from a more complicated algorithm for binary
    images.  Would Lempel-Ziv fare well in a system code environment with
    limited stack space?  If not, are there simpler algorithms to
    implement that might not have the same reduction capabilities but
    would still perform useful reduction?
    
    Our application is an image for the EEPROM on our CPU board.  The
    EEPROM image and its blasting code are piggy-backed onto the console
    code in one ROM.  Space is tight, and we need a way to reduce the
    EEPROM image significantly, without making a great increase in the
    blasting code, which is about 200 (hex) bytes.  Currently the entire
    EEPROM image is blasted into the ROM.
    
    The image is divided into several sections. Each section starts out
    with a normal pattern of data, followed by fill characters (0) until
    the beginning of the next section. 
    
    I could hard-code the data reduction: blast the varying portion of a
    section, loop to blast zeroes out to the end of that section, and
    repeat for each section.  That information is static and reasonably
    easy to code into the blasting routine.  Before resorting to that
    brute-force method, however, I wanted to ask whether a data reduction
    algorithm would make more sense.
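
    As an illustration of the kind of simpler algorithm asked about above,
    here is a minimal sketch of a zero-run-length decoder in C.  The record
    format, tag values, and function name are illustrative assumptions for
    this note, not the actual EEPROM layout or blasting code; the point is
    only that decoding such a stream is one short loop with a constant
    stack footprint.

        #include <stddef.h>

        /* Assumed record format (illustrative only): each record starts
         * with a tag byte N.
         *   0x01..0x7F : N literal data bytes follow
         *   0x81..0xFF : a run of (N & 0x7F) zero bytes; no data follows
         *   0x00       : end of stream
         * Only a few locals are needed, so stack use is constant. */
        size_t rle0_decode(const unsigned char *src, unsigned char *dst)
        {
            size_t out = 0;
            unsigned char tag;

            while ((tag = *src++) != 0x00) {
                if (tag & 0x80) {               /* run of zero fill */
                    size_t n = tag & 0x7F;
                    while (n--)
                        dst[out++] = 0;
                } else {                        /* literal section data */
                    size_t n = tag;
                    while (n--)
                        dst[out++] = *src++;
                }
            }
            return out;         /* size of the reconstructed EEPROM image */
        }

    With a mostly zero-filled image the encoded stream shrinks roughly in
    proportion to the amount of fill, and a decoder of this size can be
    weighed directly against the ~200 (hex) bytes of existing blasting
    code.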
                                       
    Thanks.
    
    Charles Norton
    Rigel System Integration/Console Development
    Low-End Mid-Range VAX Systems Business | BXB2-2/F02 | DTN: 293-5487
                                     
924.1. "Try it and see." by AKQJ10::YARBROUGH (I prefer Pi) Fri Sep 02 1988 12:40 (22 lines)
>    Is the Lempel-Ziv data compression algorithm noted in 336 suitable for
>    compressing binary images?

Yes and no. It depends on what's in the binary image. I have seen 40%
reduction of large .exe files and BACKUP save sets, using the L-Z 
algorithm.

I have also seen binary files grow LARGER when a compression algorithm is
applied. This behavior is to be expected: if you start with a file that has
already been compressed, or that has, for other reasons, very low
redundancy to begin with, you cannot expect further compression, and in
fact slight growth in size can be predicted. This effect is independent of
the compression algorithm chosen. 
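
In outline, the counting argument behind that prediction: a lossless
compressor must map distinct inputs to distinct outputs, and

    \[
      \#\{\text{inputs of length } n \text{ bits}\} = 2^n
      \;>\; \sum_{k=0}^{n-1} 2^k = 2^n - 1
      = \#\{\text{possible outputs shorter than } n \text{ bits}\},
    \]

so no scheme can shrink every input.  In practice it is the format's
fixed header and escape flags that make low-redundancy data come back
slightly larger.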

So what you need to do is to run the algorithm on your image and see how 
much it compresses. If you find that it compresses satisfactorily, then 
measure how long it takes to decompress and see if you can live with that
startup time in your application.

LZCOMP in the ToolShed has the L-Z algorithm; give it a try.
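
On the limited-stack concern in .0: an LZ77/LZSS-style decoder can be
written as a single loop over the compressed stream, with no recursion
and no tables beyond the output already written, so its stack cost is a
handful of locals.  A minimal sketch in C, assuming a hypothetical
flag-byte token format; this is not the LZCOMP stream format, just an
illustration of the decoder's shape:

    #include <stddef.h>

    /* Hypothetical LZSS stream (illustrative, not the LZCOMP format):
     * a flag byte supplies 8 flags, least significant bit first.
     *   flag = 1 : one literal byte follows
     *   flag = 0 : two bytes follow, a 12-bit offset back into the
     *              output written so far and a 4-bit length stored
     *              as (length - 3)
     * Assumes a well-formed stream.  Decoding is iterative, so stack
     * use stays constant regardless of image size. */
    size_t lzss_decode(const unsigned char *src, size_t srclen,
                       unsigned char *dst)
    {
        size_t in = 0, out = 0;

        while (in < srclen) {
            unsigned flags = src[in++];
            int bit;
            for (bit = 0; bit < 8 && in < srclen; bit++, flags >>= 1) {
                if (flags & 1) {                    /* literal byte */
                    dst[out++] = src[in++];
                } else {                            /* back-reference copy */
                    unsigned b0 = src[in++];
                    unsigned b1 = src[in++];
                    size_t offset = (size_t)((b0 << 4) | (b1 >> 4)) + 1;
                    size_t length = (size_t)(b1 & 0x0F) + 3;
                    while (length--) {
                        dst[out] = dst[out - offset];
                        out++;
                    }
                }
            }
        }
        return out;             /* size of the decompressed image */
    }

The matching encoder is larger and slower, but it runs on the host that
builds the ROM image, not in the blasting code, so only the decoder's
size and stack use matter on the target.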

Lynn Yarbrough 
924.2. "Reduction and Stack Space Only Constraints/Tnx" by WONDER::NORTON (Charles McKinley Norton | MSB) Fri Sep 02 1988 15:26 (6 lines)
    Thanks.
    
    Fortunately, time is not an issue.  All that counts is whether the
    algorithm actually reduces the image.  Thanks for the pointers.
    
    CMN