IMPROVISED SECURITY AND COMPRESSION ALGORITHM ON HUFFMAN ENCODING

M Vijayakumar, Dr. Shiny Angel

Abstract:

Huffman Encoding is a lossless algorithm for data encryption and size compression that converts streams of multi-typed data into streams of binary data and increases the occurrence of repeated patterns within them. The output of the Huffman encoding algorithm is a particular kind of optimal prefix code, and the technique is commonly referred to as lossless data compression. The proposed method combines two previously defined but now obsolete data compression algorithms which, when combined and executed over test cases of textual data, can compress at high ratios. The procedure also requires only a minimal amount of system resources such as memory (RAM). This makes the procedure highly efficient and beneficial for data transmission scenarios where the available system resources are limited and/or the bandwidth of the channel is insufficient. The de-duplication method works efficiently on data with larger inherent or generated patterns; hence, the output of Huffman Encoding serves as an appropriate input for the de-duplication module. Data de-duplication, which can also be regarded as an efficient form of compression, is a process used to eliminate redundant patterns present in data, thus decreasing storage overhead. It ensures that only one unique instance of a pattern is maintained on the storage source. Repeated data blocks in files are substituted with a referential pointer that points to the unique copy of the data pattern.
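The pipeline described in the abstract, Huffman coding followed by block-level de-duplication of the encoded stream, can be sketched roughly as below. This is a minimal illustrative sketch, not the authors' implementation: the function names, the fixed block size of 64 bits, and the pointer-based reference scheme are all assumptions introduced here for clarity.

```python
import heapq
from collections import Counter

def huffman_code(data: bytes) -> dict:
    """Build an optimal prefix code (Huffman code) for the input bytes."""
    freq = Counter(data)
    # Heap entries are (frequency, tie-breaker, symbol-or-subtree).
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, tie, (left, right)))
        tie += 1
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):          # internal node: recurse left/right
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                                # leaf: assign its bit string
            codes[node] = prefix or "0"
    walk(heap[0][2], "")
    return codes

def encode(data: bytes, codes: dict) -> str:
    """Replace every input byte with its Huffman bit string."""
    return "".join(codes[b] for b in data)

def deduplicate(bitstream: str, block_size: int = 64):
    """Replace repeated fixed-size blocks with pointers to their unique copy.

    The block size and pointer representation are illustrative assumptions.
    """
    seen = {}        # block contents -> position of its unique copy
    output = []      # entries are ("block", bits) or ("ref", position)
    for i in range(0, len(bitstream), block_size):
        block = bitstream[i:i + block_size]
        if block in seen:
            output.append(("ref", seen[block]))   # referential pointer
        else:
            seen[block] = len(output)
            output.append(("block", block))
    return output

if __name__ == "__main__":
    text = b"abracadabra abracadabra abracadabra"
    codes = huffman_code(text)
    bits = encode(text, codes)
    deduped = deduplicate(bits)
    refs = sum(1 for kind, _ in deduped if kind == "ref")
    print(len(text) * 8, "raw bits ->", len(bits), "encoded bits;",
          refs, "blocks replaced by references")
```

Because Huffman coding tends to map repeated substrings of the input to repeated bit patterns, feeding its output into the de-duplication stage, as the abstract suggests, lets the second stage remove those repetitions and keep only one unique copy per pattern.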

Keywords:

Data Encryption and Compression, Redundancy, Storage Overhead Reduction, De-duplication, Optimal Prefix Code

Paper Details
Month: 3
Year: 2020
Volume: 24
Issue: 6
Pages: 4319-4325