Sparse and Redundant Representations: From Theory to Applications in Signal and Image Processing / Edition 1

by Michael Elad

ISBN-10: 144197010X
ISBN-13: 9781441970107
Pub. Date: 08/19/2010
Publisher: Springer New York

Hardcover

$99.99

Overview

A long, long time ago, echoing philosophical and aesthetic principles that have existed since antiquity, William of Ockham enounced the principle of parsimony, better known today as Ockham's razor: "Entities should not be multiplied without necessity." This principle enabled scientists to select the "best" physical laws and theories to explain the workings of the Universe, and it has continued to guide scientific research, leading to beautiful results like the minimal description length approach to statistical inference and the related Kolmogorov complexity approach to pattern recognition. However, notions of complexity and description length are subjective concepts and depend on the language "spoken" when presenting ideas and results. The field of sparse representations, which recently underwent a Big-Bang-like expansion, explicitly deals with the Yin-Yang interplay between the parsimony of descriptions and the "language" or "dictionary" used in them, and it has become an extremely exciting area of investigation. It has already yielded a rich crop of mathematically pleasing, deep, and beautiful results that quickly translated into a wealth of practical engineering applications. You are holding in your hands the first guidebook to Sparseland, and I am sure you'll find in it both familiar and new landscapes to see and admire, as well as excellent pointers that will help you find further valuable treasures. Enjoy the journey to Sparseland!

Alfred M. Bruckstein, Haifa, Israel, December 2009

From the Preface: This book was originally written to serve as the material for an advanced one-semester (fourteen 2-hour lectures) graduate course for engineering students at the Technion, Israel.

Product Details

ISBN-13: 9781441970107
Publisher: Springer New York
Publication date: 08/19/2010
Edition description: 2010
Pages: 376
Product dimensions: 6.30(w) x 9.20(h) x 1.00(d)

About the Author

Michael Elad has been working at the Technion in Haifa, Israel, since 2003, and is currently an Associate Professor. He is one of the leaders in the field of sparse representations, conducting prolific research in mathematical signal processing with more than 60 publications in top-ranked journals, and he is widely recognized and respected in the scientific community.

Table of Contents

Part I Sparse and Redundant Representations - Theoretical and Numerical Foundations

1 Prologue 3

1.1 Underdetermined Linear Systems 3

1.2 Regularization 4

1.3 The Temptation of Convexity 5

1.4 A Closer Look at l1 Minimization 6

1.5 Conversion of (P1) to Linear Programming 8

1.6 Promoting Sparse Solutions 8

1.7 The l0-Norm and Implications 12

1.8 The (P0) Problem - Our Main Interest 13

1.9 The Signal Processing Perspective 14

Further Reading 14

2 Uniqueness and Uncertainty 17

2.1 Treating the Two-Ortho Case 17

2.1.1 An Uncertainty Principle 18

2.1.2 Uncertainty of Redundant Solutions 21

2.1.3 From Uncertainty to Uniqueness 23

2.2 Uniqueness Analysis for the General Case 23

2.2.1 Uniqueness via the Spark 23

2.2.2 Uniqueness via the Mutual-Coherence 25

2.2.3 Uniqueness via the Babel Function 27

2.2.4 Upper-Bounding the Spark 28

2.3 Constructing Grassmannian Matrices 29

2.4 Summary 30

Further Reading 31

3 Pursuit Algorithms - Practice 35

3.1 Greedy Algorithms 35

3.1.1 The Core Idea 35

3.1.2 The Orthogonal-Matching-Pursuit 36

3.1.3 Other Greedy Methods 39

3.1.4 Normalization 41

3.1.5 Rate of Decay of the Residual in Greedy Methods 43

3.1.6 Thresholding Algorithm 45

3.1.7 Numerical Demonstration of Greedy Algorithms 46

3.2 Convex Relaxation Techniques 48

3.2.1 Relaxation of the l0-Norm 48

3.2.2 Numerical Algorithms for Solving (P1) 51

3.2.3 Numerical Demonstration of Relaxation Methods 51

3.3 Summary 52

Further Reading 53

4 Pursuit Algorithms - Guarantees 55

4.1 Back to the Two-Ortho Case 55

4.1.1 OMP Performance Guarantee 55

4.1.2 BP Performance Guarantee 58

4.2 The General Case 64

4.2.1 OMP Performance Guarantee 65

4.2.2 Thresholding Performance Guarantee 67

4.2.3 BP Performance Guarantee 68

4.2.4 Performance of Pursuit Algorithms - Summary 71

4.3 The Role of the Sign-Pattern 71

4.4 Tropp's Exact Recovery Condition 73

4.5 Summary 76

Further Reading 76

5 From Exact to Approximate Solutions 79

5.1 General Motivation 79

5.2 Stability of the Sparsest Solution 80

5.2.1 Uniqueness versus Stability - Gaining Intuition 80

5.2.2 Theoretical Study of the Stability of (P0ε) 82

5.2.3 The RIP and Its Use for Stability Analysis 86

5.3 Pursuit Algorithms 89

5.3.1 OMP and BP Extensions 89

5.3.2 Iteratively-Reweighted-Least-Squares (IRLS) 91

5.3.3 The LARS Algorithm 95

5.3.4 Quality of Approximations Obtained 98

5.4 The Unitary Case 101

5.5 Performance of Pursuit Algorithms 103

5.5.1 BPDN Stability Guarantee 103

5.5.2 Thresholding Stability Guarantee 104

5.6 Summary 107

Further Reading 108

6 Iterative-Shrinkage Algorithms 111

6.1 Background 111

6.2 The Unitary Case - A Source of Inspiration 112

6.2.1 Shrinkage for the Unitary Case 112

6.2.2 The BCR Algorithm and Variations 113

6.3 Developing Iterative-Shrinkage Algorithms 115

6.3.1 Surrogate Functions and the Prox Method 115

6.3.2 EM and Bound-Optimization Approaches 117

6.3.3 An IRLS-Based Shrinkage Algorithm 119

6.3.4 The Parallel-Coordinate-Descent (PCD) Algorithm 120

6.3.5 StOMP: A Variation on Greedy Methods 123

6.3.6 Bottom Line - Iterative-Shrinkage Algorithms 125

6.4 Acceleration Using Line-Search and SESOP 127

6.5 Iterative-Shrinkage Algorithms: Tests 127

6.6 Summary 132

Further Reading 134

7 Towards Average Performance Analysis 137

7.1 Empirical Evidence Revisited 137

7.2 A Glimpse into Probabilistic Analysis 140

7.2.1 The Analysis Goals 140

7.2.2 Two-Ortho Analysis by Candes & Romberg 141

7.2.3 Probabilistic Uniqueness 143

7.2.4 Donoho's Analysis 143

7.2.5 Summary 144

7.3 Average Performance of Thresholding 144

7.3.1 Preliminaries 144

7.3.2 The Analysis 145

7.3.3 Discussion 148

7.4 Summary 150

Further Reading 150

8 The Dantzig-Selector Algorithm 153

8.1 Dantzig-Selector versus Basis-Pursuit 153

8.2 The Unitary Case 155

8.3 Revisiting the Restricted Isometry Machinery 156

8.4 Dantzig-Selector Performance Guarantee 157

8.5 Dantzig-Selector in Practice 163

8.6 Summary 164

Further Reading 165

Part II From Theory to Practice - Signal and Image Processing Applications

9 Sparsity-Seeking Methods in Signal Processing 169

9.1 Priors and Transforms for Signals 169

9.2 The Sparse-Land Model 172

9.3 Geometric Interpretation of Sparse-Land 173

9.4 Processing of Sparsely-Generated Signals 176

9.5 Analysis Versus Synthesis Signal Modeling 178

9.6 Summary 180

Further Reading 181

10 Image Deblurring - A Case Study 185

10.1 Problem Formulation 185

10.2 The Dictionary 186

10.3 Numerical Considerations 188

10.4 Experiment Details and Results 191

10.5 Summary 198

Further Reading 199

11 MAP versus MMSE Estimation 201

11.1 A Stochastic Model and Estimation Goals 201

11.2 Background on MAP and MMSE 202

11.3 The Oracle Estimation 204

11.3.1 Developing the Oracle Estimator 204

11.3.2 The Oracle Error 206

11.4 The MAP Estimation 208

11.4.1 Developing the MAP Estimator 208

11.4.2 Approximating the MAP Estimator 211

11.5 The MMSE Estimation 212

11.5.1 Developing the MMSE Estimator 212

11.5.2 Approximating the MMSE Estimator 215

11.6 MMSE and MAP Errors 218

11.7 More Experimental Results 220

11.8 Summary 224

Further Reading 224

12 The Quest for a Dictionary 227

12.1 Choosing versus Learning 227

12.2 Dictionary-Learning Algorithms 228

12.2.1 Core Questions in Dictionary-Learning 229

12.2.2 The MOD Algorithm 230

12.2.3 The K-SVD Algorithm 231

12.3 Training Structured Dictionaries 237

12.3.1 The Double-Sparsity Model 239

12.3.2 Union of Unitary Bases 241

12.3.3 The Signature Dictionary 242

12.4 Summary 244

Further Reading 244

13 Image Compression - Facial Images 247

13.1 Compression of Facial Images 247

13.2 Previous Work 249

13.3 Sparse-Representation-Based Coding Scheme 250

13.3.1 The General Scheme 251

13.3.2 VQ Versus Sparse Representations 253

13.4 More Details and Results 254

13.4.1 K-SVD Dictionaries 255

13.4.2 Reconstructed Images 255

13.4.3 Run-Time and Memory Usage 260

13.4.4 Comparing to Other Techniques 261

13.4.5 Dictionary Redundancy 262

13.5 Post-Processing for Deblocking 263

13.5.1 The Blockiness Artifacts 263

13.5.2 Possible Approaches for Deblocking 265

13.5.3 Learning-Based Deblocking Approach 266

13.6 Deblocking Results 267

13.7 Summary 268

Further Reading 269

14 Image Denoising 273

14.1 General Introduction - Image Denoising 273

14.2 The Beginning: Global Modeling 274

14.2.1 The Core Image-Denoising Algorithm 274

14.2.2 Various Improvements 276

14.3 From Global to Local Modeling 278

14.3.1 The General Methodology 278

14.3.2 Learning the Shrinkage Curves 279

14.3.3 Learned Dictionary and Globalizing the Prior 286

14.3.4 The Non-Local-Means Algorithm 292

14.3.5 3D-DCT Shrinkage: BM3D Denoising 296

14.4 SURE for Automatic Parameter Setting 297

14.4.1 Development of the SURE 298

14.4.2 Demonstrating SURE on Global-Thresholding 300

14.5 Summary 303

Further Reading 303

15 Other Applications 309

15.1 General 309

15.2 Image Separation via MCA 310

15.2.1 Image = Cartoon + Texture 310

15.2.2 Global MCA for Image Separation 312

15.2.3 Local MCA for Image Separation 316

15.3 Image Inpainting and Impulsive Noise Removal 324

15.3.1 Inpainting Sparse-Land Signals - Core Principles 324

15.3.2 Inpainting Images - Local K-SVD 327

15.3.3 Inpainting Images - The Global MCA 335

15.3.4 Impulse-Noise Filtering 338

15.4 Image Scale-Up 341

15.4.1 Modeling the Problem 343

15.4.2 The Super-Resolution Algorithm 346

15.4.3 Scaling-Up Results 349

15.4.4 Image Scale-Up: Summary 351

15.5 Summary 353

Further Reading 354

16 Epilogue 359

16.1 What is it All About? 359

16.2 What is Still Missing? 359

16.3 Bottom Line 360

Notation 363

Acronyms 369

Index 371
