Error-Tolerant Big Data Processing

12/18/2017
by Dong Deng, et al.

Real-world data contains various kinds of errors, so raw data usually has to be processed before analysis. Traditional data processing based on exact matching, however, misses a large amount of valid information. To obtain high-quality analysis results at big-data scale, this thesis studies error-tolerant big data processing. Since most real-world data can be represented as sequences or sets, the thesis uses the widely adopted sequence-based and set-based similarity functions to tolerate errors in data processing and studies the approximate entity extraction, similarity join, and similarity search problems. The main contributions of this thesis are:

1. A unified framework that supports approximate entity extraction with both sequence-based and set-based similarity functions simultaneously. Experiments show that the unified framework outperforms state-of-the-art methods by one to two orders of magnitude.

2. Two partition-based methods, one for sequence similarity join and one for set similarity join. For sequence similarity join, the sequences are evenly partitioned into segments, with the guarantee that two sequences can be similar only if one of them contains a substring identical to a segment of the other (see the sketch after the abstract). For set similarity join, all sets are partitioned into segments based on the element universe. Both partition-based methods are further extended to the large-scale data processing frameworks MapReduce and Spark. The partition-based method won the string similarity join competition held by EDBT, outperforming the runner-up by a factor of 10.

3. A pivotal prefix filter technique for the sequence similarity search problem, shown to have stronger pruning power and lower filtering cost than state-of-the-art filters.
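The even-partition guarantee in contribution 2 follows a pigeonhole argument: if two strings are within edit distance tau and one of them is split into tau + 1 segments, at most tau segments can be affected by edit operations, so at least one segment appears unchanged as a substring of the other string. The Python sketch below illustrates the resulting filter-and-verify step for a single pair; the function names and the simple verification routine are illustrative assumptions, not the thesis implementation.

# Minimal sketch of the pigeonhole-style segment filter behind the
# partition-based similarity join (illustrative, not the thesis code).

def partition(s, tau):
    """Evenly split s into tau + 1 disjoint segments."""
    k = tau + 1
    base, extra = divmod(len(s), k)
    segments, start = [], 0
    for i in range(k):
        length = base + (1 if i < extra else 0)
        segments.append(s[start:start + length])
        start += length
    return segments

def edit_distance(a, b):
    """Standard dynamic-programming edit distance (verification step)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def similar(r, s, tau):
    """Filter-then-verify: if ed(r, s) <= tau, then s must contain at
    least one of r's tau + 1 segments as a substring (pigeonhole)."""
    if abs(len(r) - len(s)) > tau:
        return False                       # length filter
    if not any(seg in s for seg in partition(r, tau)):
        return False                       # segment filter, skip verification
    return edit_distance(r, s) <= tau      # verify surviving candidates

# "similarity" and "simularity" differ by one substitution.
print(similar("similarity", "simularity", tau=1))   # True
print(similar("similarity", "dissimilar", tau=1))   # False

In a full join algorithm the segments would typically be indexed so that candidate pairs are generated from segment matches rather than by comparing every pair; the sketch only shows the per-pair filtering and verification logic.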
