Digital Image Processing


In the previous edition, Chapter 3 was devoted exclusively to image transforms.

One of the major changes in the book is that image transforms are now introduced when they are needed. This allowed us to begin discussion of image processing techniques much earlier than before, further addressing the second finding of the market survey. Chapters 3 and 4 in the current edition deal with image enhancement, as opposed to a single chapter (Chapter 4) in the previous edition. The new organization of this material does not imply that image enhancement is more important than other areas.

Rather, we used it as an avenue to introduce spatial methods for image processing (Chapter 3), as well as the Fourier transform, the frequency domain, and image filtering (Chapter 4). Our purpose for introducing these concepts in the context of image enhancement (a subject particularly appealing to beginners) was to increase the level of intuitiveness in the presentation, thus addressing partially the third major finding in the marketing survey.

This organization also gives instructors flexibility in the amount of frequency-domain material they wish to cover. Chapter 5 also was rewritten completely in a more intuitive manner. The coverage of this topic in earlier editions of the book was based on matrix theory. Although unified and elegant, this type of presentation is difficult to follow, particularly by undergraduates. The new presentation covers essentially the same ground, but the discussion does not rely on matrix theory and is much easier to understand, due in part to numerous new examples.

The price paid for this newly gained simplicity is the loss of a unified approach, in the sense that in the earlier treatment a number of restoration results could be derived from one basic formulation. On balance, however, we believe that readers (especially beginners) will find the new treatment much more appealing and easier to follow. Also, as indicated below, the old material is stored in the book Web site for easy access by individuals preferring to follow a matrix-theory formulation.

Chapter 6, dealing with color image processing, is new. Interest in this area has increased significantly in the past few years as a result of growth in the use of digital images for Internet applications. Our treatment of this topic represents a significant expansion of the material from previous editions.

Similarly, Chapter 7, dealing with wavelets, is new.


In addition to a number of signal processing applications, interest in this area is motivated by the need for more sophisticated methods for image compression, a topic that in turn is motivated by an increase in the number of images transmitted over the Internet or stored in Web servers. Chapter 8, dealing with image compression, was updated to include new compression methods and standards, but its fundamental structure remains the same as in the previous edition. Several image transforms, previously covered in Chapter 3 and whose principal use is compression, were moved to this chapter.

Chapter 9, dealing with morphological image processing, is new. It is based on a significant expansion of the material previously included as a section in the chapter on image representation and description. Chapter 10, dealing with image segmentation, has the same basic structure as before, but numerous new examples were included and a new section on segmentation by morphological watersheds was added.

Chapter 11, dealing with image representation and description, was shortened slightly by the removal of the material now included in Chapter 9. New examples were added and the Hotelling transform (description by principal components), previously included in Chapter 3, was moved to this chapter. Chapter 12, dealing with object recognition, was shortened by the removal of topics dealing with knowledge-based image analysis, a topic now covered in considerable detail in a number of books which we reference in Chapters 1 and 12. Experience since the last edition of Digital Image Processing indicates that the new, shortened coverage of object recognition is a logical place at which to conclude the book.

Although the book is totally self-contained, we have established a companion Web site (see inside front cover) designed to provide support to users of the book. For students following a formal course of study or individuals embarked on a program of self study, the site contains a number of tutorial reviews on background material such as probability, statistics, vectors, and matrices, prepared at a basic level and written using the same notation as in the book.

Detailed solutions to many of the exercises in the book also are provided. For instructors, the site contains suggested teaching outlines, classroom presentation materials, laboratory experiments, and various image databases (including most images from the book). In addition, part of the material removed from the previous edition is stored in the Web site for easy download and classroom use, at the discretion of the instructor.

This edition of Digital Image Processing is a reflection of the significant progress that has been made in this field in just the past decade. As is usual in a project such as this, progress continues after work on the manuscript stops. One of the reasons earlier versions of this book have been so well accepted throughout the world is their emphasis on fundamental concepts, an approach that, among other things, attempts to provide a measure of constancy in a rapidly evolving body of knowledge.

We have tried to observe that same principle in preparing this edition of the book.

Preview

Interest in digital image processing methods stems from two principal application areas: improvement of pictorial information for human interpretation, and processing of image data for storage, transmission, and representation for autonomous machine perception. This chapter has several objectives. An image may be defined as a two-dimensional function, f(x, y), where x and y are spatial (plane) coordinates, and the amplitude of f at any pair of coordinates (x, y) is called the intensity or gray level of the image at that point.

When x, y, and the amplitude values of f are all finite, discrete quantities, we call the image a digital image. The field of digital image processing refers to processing digital images by means of a digital computer.
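As a concrete, if toy, illustration of these definitions, the sketch below represents a digital image as a two-dimensional array of discrete gray levels and reads off one pixel value. NumPy and the hand-typed values are illustrative choices, not something prescribed by the text.

```python
import numpy as np

# A digital image is a 2-D array of discrete intensity values f(x, y).
# The 4x4 array below is a made-up example; real images come from a sensor
# or a file reader, not from hand-typed literals.
f = np.array([[ 12,  30,  55,  80],
              [ 20,  45,  90, 130],
              [ 35,  70, 140, 200],
              [ 50, 100, 180, 255]], dtype=np.uint8)

rows, cols = f.shape          # spatial extent of the image
pixel = f[2, 3]               # intensity (gray level) at one coordinate pair
mean_intensity = f.mean()     # the "trivial" average-intensity computation

print(rows, cols, pixel, mean_intensity)
```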


A digital image is composed of a finite number of elements, each of which has a particular location and value. These elements are referred to as picture elements, image elements, pels, and pixels. Pixel is the term most widely used to denote the elements of a digital image. We consider these definitions in more formal terms in Chapter 2. Vision is the most advanced of our senses, so it is not surprising that images play the single most important role in human perception.

However, unlike humans, who are limited to the visual band of the electromagnetic (EM) spectrum, imaging machines cover almost the entire EM spectrum, ranging from gamma to radio waves. They can operate on images generated by sources that humans are not accustomed to associating with images.

These include ultrasound, electron microscopy, and computer-generated images. Thus, digital image processing encompasses a wide and varied field of applications. There is no general agreement among authors regarding where image processing stops and other related areas, such as image analysis and computer vision, start.

Sometimes a distinction is made by defining image processing as a discipline in which both the input and output of a process are images. We believe this to be a limiting and somewhat artificial boundary.

For example, under this definition, even the trivial task of computing the average intensity of an image (which yields a single number) would not be considered an image processing operation. On the other hand, there are fields such as computer vision whose ultimate goal is to use computers to emulate human vision, including learning and being able to make inferences and take actions based on visual inputs.

This area itself is a branch of artificial intelligence (AI) whose objective is to emulate human intelligence. The field of AI is in its earliest stages of infancy in terms of development, with progress having been much slower than originally anticipated.

The area of image analysis (also called image understanding) is in between image processing and computer vision. There are no clear-cut boundaries in the continuum from image processing at one end to computer vision at the other. However, one useful paradigm is to consider three types of computerized processes in this continuum: low-, mid-, and high-level processes. Low-level processes involve primitive operations such as image preprocessing to reduce noise, contrast enhancement, and image sharpening.

A low-level process is characterized by the fact that both its inputs and outputs are images. Mid-level processing on images involves tasks such as segmentation (partitioning an image into regions or objects), description of those objects to reduce them to a form suitable for computer processing, and classification (recognition) of individual objects. A mid-level process is characterized by the fact that its inputs generally are images, but its outputs are attributes extracted from those images (e.g., edges, contours, and the identity of individual objects).

Based on the preceding comments, we see that a logical place of overlap between image processing and image analysis is the area of recognition of individual regions or objects in an image. As a simple illustration to clarify these concepts, consider the area of automated analysis of text. The processes of acquiring an image of the area containing the text, preprocessing that image, extracting (segmenting) the individual characters, describing the characters in a form suitable for computer processing, and recognizing those individual characters are in the scope of what we call digital image processing in this book.

The concepts developed in the following chapters are the foundation for the methods used in those application areas.

Introduction of the Bartlane cable picture transmission system in the early 1920s reduced the time required to transport a picture across the Atlantic from more than a week to less than three hours. Specialized printing equipment coded pictures for cable transmission and then reconstructed them at the receiving end.

Some of the initial problems in improving the visual quality of these early digital pictures were related to the selection of printing procedures and the distribution of intensity levels. The printing method used to obtain the earliest pictures was soon abandoned in favor of a technique based on photographic reproduction, and the improvements in tonal quality and resolution were evident, although some errors remained visible. The early Bartlane systems were capable of coding images in five distinct levels of gray.

This capability was increased to 15 levels in 1929. During this period, introduction of a system for developing a film plate via light beams that were modulated by the coded picture tape improved the reproduction process considerably. Although the examples just cited involve digital images, they are not considered digital image processing results in the context of our definition because computers were not involved in their creation.

Thus, the history of digital image processing is intimately tied to the development of the digital computer. In fact, digital images require so much storage and computational power that progress in the field of digital image processing has been dependent on the development of digital computers and of supporting technologies that include data storage, display, and transmission.

The idea of a computer goes back to the invention of the abacus in Asia Minor, more than 5000 years ago. More recently, there were developments in the past two centuries that are the foundation of what we call a computer today. However, the basis for what we call a modern digital computer dates back to only the 1940s with the introduction by John von Neumann of two key concepts: (1) a memory to hold a stored program and data, and (2) conditional branching. These two ideas are the foundation of a central processing unit (CPU), which is at the heart of computers today.

Concurrent with these advances were developments in the areas of mass storage and display systems, both of which are fundamental requirements for digital image processing. The first computers powerful enough to carry out meaningful image processing tasks appeared in the early 1960s. The birth of what we call digital image processing today can be traced to the availability of those machines and the onset of the space program during that period.

It took the combination of those two developments to bring into focus the potential of digital image processing concepts. Work on using computer techniques for improving images from a space probe began at the Jet Propulsion Laboratory (Pasadena, California) in 1964, when pictures of the moon transmitted by Ranger 7 were processed by a computer to correct various types of image distortion inherent in the on-board television camera. Ranger 7 took the first such image on July 31, 1964 at 9:09 A.M. Eastern Daylight Time (EDT), about 17 minutes before impacting the lunar surface (the markers, called reseau marks, are used for geometric corrections, as discussed in Chapter 5).

This also is the first image of the moon taken by a U.S. spacecraft. The imaging lessons learned with Ranger 7 served as the basis for improved methods used to enhance and restore images from the Surveyor missions to the moon, the Mariner series of flyby missions to Mars, the Apollo manned flights to the moon, and others.


The invention in the early 1970s of computerized axial tomography (CAT), also called computerized tomography (CT) for short, is one of the most important events in the application of image processing in medical diagnosis. Computerized axial tomography is a process in which a ring of detectors encircles an object (or patient) and an X-ray source, concentric with the detector ring, rotates about the object.

The X-rays pass through the object and are collected at the opposite end by the corresponding detectors in the ring.

As the source rotates, this procedure is repeated. Motion of the object in a direction perpendicular to the ring of detectors produces a set of such slices, which constitute a three-dimensional (3-D) rendition of the inside of the object. Tomography was invented independently by Sir Godfrey N. Hounsfield and Professor Allan M.

Cormack, who shared the 1979 Nobel Prize in Medicine for their invention. It is interesting to note that X-rays were discovered in 1895 by Wilhelm Conrad Roentgen, for which he received the 1901 Nobel Prize for Physics. These two inventions, nearly 100 years apart, led to some of the most active application areas of image processing today.

From the 1960s until the present, the field of image processing has grown vigorously. In addition to applications in medicine and the space program, digital image processing techniques now are used in a broad range of applications. Computer procedures are used to enhance the contrast or code the intensity levels into color for easier interpretation of X-rays and other images used in industry, medicine, and the biological sciences. Geographers use the same or similar techniques to study pollution patterns from aerial and satellite imagery.

Image enhancement and restoration procedures are used to process degraded images of unrecoverable objects or experimental results too expensive to duplicate. In archeology, image processing methods have successfully restored blurred pictures that were the only available records of rare artifacts lost or damaged after being photographed.

In physics and related fields, computer techniques routinely enhance images of experiments in areas such as high-energy plasmas and electron microscopy. Similarly successful applications of image processing concepts can be found in astronomy, biology, nuclear medicine, law enforcement, defense, and industrial applications. These examples illustrate processing results intended for human interpretation.

The second major area of application of digital image processing techniques mentioned at the beginning of this chapter is in solving problems dealing with machine perception. In this case, interest focuses on procedures for extracting from an image information in a form suitable for computer processing. Often, this information bears little resemblance to visual features that humans use in interpreting the content of an image.

Examples of the type of information used in machine perception are statistical moments, Fourier transform coefficients, and multidimensional distance measures. The continuing decline in the ratio of computer price to performance and the expansion of networking and communication bandwidth via the World Wide Web and the Internet have created unprecedented opportunities for continued growth of digital image processing.

Some of these application areas are illustrated in the following section. We can cover only a few of these applications in the context and space of the current discussion. We show in this section numerous areas of application, each of which routinely utilizes the digital image processing techniques developed in the following chapters. Many of the images shown in this section are used later in one or more of the examples given in the book.

All images shown are digital. The areas of application of digital image processing are so varied that some form of organization is desirable in attempting to capture the breadth of this field.

One of the simplest ways to develop a basic understanding of the extent of image processing applications is to categorize images according to their source (e.g., visual, X-ray, and so on). The principal energy source for images in use today is the electromagnetic energy spectrum. Other important sources of energy include acoustic, ultrasonic, and electronic (in the form of electron beams used in electron microscopy).

Synthetic images, used for modeling and visualization, are generated by computer.

In this section we discuss briefly how images are generated in these various categories and the areas in which they are applied. Methods for converting images into digital form are discussed in the next chapter. Images based on radiation from the EM spectrum are the most familiar, especially images in the X-ray and visual bands of the spectrum.

Electromagnetic waves can be conceptualized as propagating sinusoidal waves of varying wavelengths, or they can be thought of as a stream of massless particles, each traveling in a wavelike pattern and moving at the speed of light. Each massless particle contains a certain amount (or bundle) of energy. Each bundle of energy is called a photon. If spectral bands are grouped according to energy per photon, we obtain the spectrum shown in Fig. The bands are shown shaded to convey the fact that bands of the EM spectrum are not distinct but rather transition smoothly from one to the other.
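A small sketch of the "energy per photon" grouping mentioned above: the energy of a photon is Planck's constant times the speed of light divided by the wavelength, so shorter-wavelength bands carry more energy per photon. The representative wavelengths below are rough illustrative values, not figures taken from the text.

```python
# Energy per photon, E = h*c / wavelength, which is what orders the EM bands
# from gamma rays (highest energy) down to radio waves (lowest energy).
H = 6.626e-34   # Planck's constant, J*s
C = 2.998e8     # speed of light, m/s

bands = {            # rough representative wavelengths, in metres
    "gamma ray":   1e-12,
    "X-ray":       1e-10,
    "ultraviolet": 2e-7,
    "visible":     5e-7,
    "infrared":    1e-5,
    "radio":       1e0,
}

for name, wavelength in bands.items():
    energy_joules = H * C / wavelength
    energy_ev = energy_joules / 1.602e-19   # convert joules to electron-volts
    print(f"{name:12s} ~{energy_ev:.3g} eV per photon")
```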

In nuclear medicine, the approach is to inject a patient with a radioactive isotope that emits gamma rays as it decays. Images are produced from the emissions collected by gamma ray detectors. Images of this sort are used to locate sites of bone pathology, such as infections or tumors.

However, instead of using an external source of X-ray energy, the patient is given a radioactive isotope that emits positrons as it decays. When a positron meets an electron, both are annihilated and two gamma rays are given off.

These are detected and a tomographic image is created using the basic principles of tomography; this modality is known as positron emission tomography (PET). The example image shows a tumor in the brain and one in the lung, easily visible as small white masses.

A star in the constellation of Cygnus exploded about 15,000 years ago, generating a superheated stationary gas cloud known as the Cygnus Loop that glows in a spectacular array of colors. Unlike the two preceding examples, the Cygnus Loop image was acquired using the natural radiation of the object being imaged. Finally, in an image of gamma radiation from a valve in a nuclear reactor, an area of strong radiation is seen in the lower, left side of the image.

The best known use of X-rays is medical diagnostics, but they also are used extensively in industry and other areas, like astronomy. X-rays for medical and industrial imaging are generated using an X-ray tube, which is a vacuum tube with a cathode and anode. The cathode is heated, causing free electrons to be released.

These electrons flow at high speed to the positively charged anode. When the electrons strike a nucleus, energy is released in the form of X-ray radiation. The energy (penetrating power) of the X-rays is controlled by a voltage applied across the anode, and the number of X-rays is controlled by a current applied to the filament in the cathode. The intensity of the X-rays is modified by absorption as they pass through the patient, and the resulting energy falling on the film develops it, much in the same way that light develops photographic film.
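The attenuation described above is commonly modeled with an exponential (Beer-Lambert) law, although the text does not name one; the sketch below uses made-up attenuation coefficients purely to illustrate why denser material casts a darker shadow on the detector.

```python
import math

# Illustrative attenuation model: I = I0 * exp(-mu * x), where mu is a
# material-dependent attenuation coefficient and x is the thickness traversed.
# Denser material (e.g. bone vs. soft tissue) has a larger mu, so less energy
# reaches the film or detector behind it.
I0 = 1.0                 # incident X-ray intensity (arbitrary units)
thickness_cm = 10.0      # path length through the material
materials = {"soft tissue": 0.2, "bone": 0.5}   # made-up mu values, per cm

for label, mu in materials.items():
    transmitted = I0 * math.exp(-mu * thickness_cm)
    print(f"{label}: {transmitted:.3f} of the incident intensity transmitted")
```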

In digital radiography, digital images are obtained by one of two methods: (1) by digitizing X-ray films, or (2) by having the X-rays that pass through the patient fall directly onto devices (such as a phosphor screen) that convert X-rays to light. The light signal in turn is captured by a light-sensitive digitizing system.

We discuss digitization in detail in Chapter 2. Angiography is another major application in an area called contrast-enhancement radiography. This procedure is used to obtain images (called angiograms) of blood vessels. A catheter (a small, flexible, hollow tube) is inserted, for example, into an artery or vein in the groin. The catheter is threaded into the blood vessel and guided to the area to be studied. When the catheter reaches the site under investigation, an X-ray contrast medium is injected through the catheter.

This enhances contrast of the blood vessels and enables the radiologist to see any irregularities or blockages. The catheter can be seen being inserted into the large blood vessel on the lower left of the picture. As discussed in Chapter 3, angiography is a major area of digital image processing, where image subtraction is used to enhance further the blood vessels being studied.

Perhaps the best known of all uses of X-rays in medical imaging is computerized axial tomography. Due to their resolution and 3-D capabilities, CAT scans revolutionized medicine from the moment they first became available in the early 1970s. As noted earlier, each CAT image is a slice taken perpendicularly through the patient; numerous slices are generated as the patient is moved in a longitudinal direction. The ensemble of such images constitutes a 3-D rendition of the inside of the patient, with the longitudinal resolution being proportional to the number of slice images taken.
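A minimal sketch of how a stack of 2-D slices becomes a 3-D rendition; the random arrays below stand in for real reconstructed CT slices, and the dimensions are arbitrary.

```python
import numpy as np

# Each CT "slice" is a 2-D image; moving the patient longitudinally yields a
# stack of slices, and stacking them gives a 3-D volume. The slices here are
# random placeholders standing in for real reconstructed images.
num_slices, height, width = 40, 256, 256
slices = [np.random.rand(height, width) for _ in range(num_slices)]

volume = np.stack(slices, axis=0)        # shape: (slices, rows, cols)
print(volume.shape)

# Longitudinal resolution improves with the number of slices per unit length;
# a cut along the patient axis is simply a different indexing of the volume.
sagittal_cut = volume[:, :, width // 2]
print(sagittal_cut.shape)
```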

Techniques similar to the ones just discussed, but generally involving higher-energy X-rays, are applicable in industrial processes.

Such images, representative of literally hundreds of industrial applications of X-rays, are used to examine circuit boards for flaws in manufacturing, such as missing components or broken traces. Industrial CAT scans are useful when the parts can be penetrated by X-rays, such as in plastic assemblies, and even large bodies, like solid-propellant rocket motors. Applications of imaging in the ultraviolet band are varied. They include lithography, industrial inspection, microscopy, lasers, biological imaging, and astronomical observations.

We illustrate imaging in this band with examples from microscopy and astronomy. Ultraviolet light is used in fluorescence microscopy, one of the fastest growing areas of microscopy.


Fluorescence is a phenomenon discovered in the middle of the nineteenth century, when it was first observed that the mineral fluorspar fluoresces when ultraviolet light is directed upon it. The ultraviolet light itself is not visible, but when a photon of ultraviolet radiation collides with an electron in an atom of a fluorescent material, it elevates the electron to a higher energy level.

Subsequently, the excited electron relaxes to a lower level and emits light in the form of a lower-energy photon in the visible (red) light region. The basic task of the fluorescence microscope is to use an excitation light to irradiate a prepared specimen and then to separate the much weaker radiating fluorescent light from the brighter excitation light. Thus, only the emission light reaches the eye or other detector.

The resulting fluorescing areas shine against a dark background with sufficient contrast to permit detection. The darker the background of the nonfluorescing material, the more efficient the instrument. Fluorescence microscopy is an excellent method for studying materials that can be made to fluoresce, either in their natural form (primary fluorescence) or when treated with chemicals capable of fluorescing (secondary fluorescence).

Typical of the capability of fluorescence microscopy are images of normal corn and of corn infected by smut, a disease of cereals and grasses caused by parasitic fungi. Corn smut is particularly harmful because corn is one of the principal food sources in the world. We consider in the following discussion applications in light microscopy, astronomy, remote sensing, industry, and law enforcement. The examples range from pharmaceuticals and microinspection to materials characterization. Even in just microscopy, the application areas are too numerous to detail here.

It is not difficult to conceptualize the types of processes one might apply to these images, ranging from enhancement to measurements. Table 1.1 shows the thematic bands in NASA's LANDSAT satellite.

The primary function of LANDSAT is to obtain and transmit images of the Earth from space, for purposes of monitoring environmental conditions on the planet. Note the characteristics and uses of each band. In order to develop a basic appreciation for the power of this type of multispectral imaging, consider Fig.

The numbers refer to the thematic bands in Table 1.1. The area imaged is Washington, D.C. Images of population centers are used routinely over time to assess population growth and shift patterns, pollution, and other factors harmful to the environment.

The differences between visual and infrared image features are quite noticeable in these images. Observe, for example, how well defined the river is from its surroundings in Bands 4 and 5. Weather observation and prediction also are major applications of multispectral imaging from satellites. For example, the eye of a hurricane is clearly visible in satellite images of such storms. Infrared satellite images taken at night are part of the Nighttime Lights of the World data set, which provides a global inventory of human settlements.

These images were gathered with an infrared imaging system. Even without formal training in image processing, it is not difficult to imagine writing a computer program that would use these images to estimate the percent of total electrical energy used by various regions of the world.
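A toy version of the program imagined above, assuming we already have a nighttime-light image and a mask for each region of interest; the synthetic image and the two rectangular "regions" below are placeholders.

```python
import numpy as np

# Sum nighttime-light brightness inside each region and report its fraction
# of the global total. Both the image and the region masks are synthetic.
np.random.seed(0)
lights = np.random.rand(100, 200)            # stand-in for a nighttime image

regions = {
    "region A": np.zeros((100, 200), dtype=bool),
    "region B": np.zeros((100, 200), dtype=bool),
}
regions["region A"][:, :100] = True          # left half of the image
regions["region B"][:, 100:] = True          # right half of the image

total = lights.sum()
for name, mask in regions.items():
    share = lights[mask].sum() / total
    print(f"{name}: {share:.1%} of total observed brightness")
```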

The small gray map is provided for reference. A major area of imaging in the visual spectrum is in automated visual inspection of manufactured goods. A typical image processing task with products like this is to inspect them for missing parts (the black square on the top, right quadrant of the image is an example of a missing component). The objective here is to have a machine look for missing pills. Detecting anomalies like these is a major theme of industrial inspection that includes other products such as wood and cloth.

Most of the other small speckle detail is debris. The objective in this type of inspection is to find damaged or incorrectly manufactured implants automatically, prior to packaging. As a final illustration of image processing in the visual spectrum, consider images of fingerprints, paper currency, and license plates. Images of fingerprints are routinely processed by computer, either to enhance them or to find features that aid in the automated search of a database for potential matches.

Applications of digital image processing in this area include automated counting and, in law enforcement, the reading of the serial number for the purpose of tracking and identifying bills.

The two vehicle images, courtesy of Pete Sites, Perceptics Corporation, are examples of automated license plate reading.

The black rectangles show the results of automated reading of the plate content by the system. License plate and other applications of character recog- nition are used extensively for traffic monitoring and surveillance. The unique feature of imaging radar is its ability to collect data over virtually any region at any time, regardless of weather or ambient lighting conditions. Automated license plate reading.

An imaging radar works like a flash camera in that it provides its own illumination (microwave pulses) to illuminate an area on the ground and take a snapshot image. Instead of a camera lens, a radar uses an antenna and digital computer processing to record its images.

In a radar image, one can see only the microwave energy that was reflected back toward the radar antenna. In the lower right corner is a wide valley of the Lhasa River, which is populated by Tibetan farmers and yak herders and includes the village of Menba.

Mountains in this area reach about 5800 m (19,000 ft) above sea level, while the valley floors lie about 4300 m (14,000 ft) above sea level. Note the clarity and detail of the image, unencumbered by clouds or other atmospheric conditions that normally interfere with images in the visual band. In medicine, radio waves are used in magnetic resonance imaging (MRI). This technique places a patient in a powerful magnet and passes radio waves through his or her body in short pulses.

Each pulse causes a responding pulse of radio waves to be emitted by the patient's tissues. The location from which these signals originate and their strength are determined by a computer, which produces a two-dimensional picture of a section of the patient. MRI can produce pictures in any plane. Also shown for an interesting comparison are images of the same region but taken in most of the bands discussed earlier. Although imaging in the electromagnetic spectrum is dominant by far, a number of other imaging modalities also are important. Specifically, we discuss in this section acoustic imaging, electron microscopy, and synthetic (computer-generated) imaging.

Geological applications use sound in the low end of the sound spectrum (hundreds of Hertz), while imaging in other areas uses ultrasound (millions of Hertz). The most important commercial applications of image processing in geology are in mineral and oil exploration. For image acquisition over land, one of the main approaches is to use a large truck and a large flat steel plate. The strength and speed of the returning sound waves are determined by the composition of the earth below the surface.

These are analyzed by computer, and images are generated from the resulting analysis. For marine acquisition, the energy source consists usually of two air guns towed behind a ship.

Returning sound waves are detected by hydrophones placed in cables that are either towed behind the ship, laid on the bottom of the ocean, or hung from buoys (vertical cables). The constant motion of the ship provides a transversal direction of motion that, together with the returning sound waves, is used to generate a 3-D map of the composition of the Earth below the bottom of the ocean.

This target is brighter than the surrounding layers because the change in density in the target region is larger. (Courtesy of Dr. Curtis Ober, Sandia National Laboratories.)

The layers above also are bright, but their brightness does not vary as strongly across the layers. Many seismic reconstruction algorithms have difficulty imaging this target because of the faults above it.

Although ultrasound imaging is used routinely in manufacturing, the best known applications of this technique are in medicine, especially in obstetrics, where unborn babies are imaged to determine the health of their development. A byproduct of this examination is determining the sex of the baby. Ultrasound images are generated using the following basic procedure: the ultrasound system (a computer, an ultrasound probe consisting of a source and receiver, and a display) transmits high-frequency (1 to 5 MHz) sound pulses into the body.

The sound waves travel into the body and hit a boundary between tissues (e.g., between fluid and soft tissue, or between soft tissue and bone). Some of the sound waves are reflected back to the probe, while some travel on further until they reach another boundary and get reflected. The reflected waves are picked up by the probe and relayed to the computer.
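The computer's job at this step is essentially to convert echo timing into depth. A minimal sketch of that conversion is shown below, assuming the commonly used average speed of sound in soft tissue of about 1540 m/s, a figure not given in the text.

```python
# Depth of a reflecting boundary from the round-trip time of an echo:
# the pulse travels to the boundary and back, hence the division by two.
SPEED_OF_SOUND_TISSUE = 1540.0   # m/s, a commonly assumed average for soft tissue

def echo_depth_m(round_trip_seconds: float) -> float:
    return SPEED_OF_SOUND_TISSUE * round_trip_seconds / 2.0

# An echo returning after 65 microseconds corresponds to roughly 5 cm depth.
print(f"{echo_depth_m(65e-6) * 100:.1f} cm")
```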

The system displays the distances and intensities of the echoes on the screen, forming a two-dimensional image. In a typical ultrasound image, millions of pulses and echoes are sent and received each second. The probe can be moved along the surface of the body and angled to obtain various views. We continue the discussion on imaging modalities with some examples of electron microscopy. Electron microscopes function as their optical counterparts, except that they use a focused beam of electrons instead of light to image a specimen.

The operation of electron microscopes involves the following basic steps: A stream of electrons is produced by an electron source and accelerated toward the specimen using a positive electrical potential.

This beam is focused onto the sample using a magnetic lens. Interactions occur inside the irradiated sample, affecting the electron beam. These interactions and effects are detected and transformed into an image, much in the same way that light is reflected from, or absorbed by, objects in a scene.

These basic steps are carried out in all electron microscopes, regardless of type. A transmission electron microscope (TEM) works much like a slide projector. A projector shines (transmits) a beam of light through the slide; as the light passes through the slide, it is affected by the contents of the slide.

This transmitted beam is then projected onto the viewing screen, forming an enlarged image of the slide. TEMs work the same way, except that they shine a beam of electrons through a specimen (analogous to the slide).

The fraction of the beam transmitted through the specimen is projected onto a phosphor screen. The interaction of the electrons with the phosphor produces light and, therefore, a viewable image. A scanning electron microscope (SEM), on the other hand, actually scans the electron beam and records the interaction of beam and sample at each location. This produces one dot on a phosphor screen.

A complete image is formed by a raster scan of the beam through the sample, much like a TV camera. The electrons interact with a phosphor screen and produce light.

Electron microscopes are capable of very high magnification. The white fibers are oxides resulting from thermal destruction.

We conclude the discussion of imaging modalities by looking briefly at images that are not obtained from physical objects.

Instead, they are generated by computer. Fractals are striking examples of computer-generated images (Lu []). Basically, a fractal is nothing more than an iterative reproduction of a basic pattern according to some mathematical rules. For instance, tiling is one of the simplest ways to generate a fractal image. A square can be subdivided into four square subregions, each of which can be further subdivided into four smaller square regions, and so on. Depending on the complexity of the rules for filling each subsquare, some beautiful tile images can be generated using this method.
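A minimal sketch of the tiling idea just described: a square is recursively subdivided into four subsquares and a simple, arbitrary fill rule is applied at each level. The specific rule (darkening one quadrant) is an invented example, not a method from the text.

```python
import numpy as np

def tile(img, r0, c0, size, depth):
    """Recursively subdivide a square region into four subsquares and
    darken one of them at each level (an arbitrary fill rule)."""
    if depth == 0 or size < 2:
        return
    half = size // 2
    img[r0:r0 + half, c0:c0 + half] //= 2      # darken the top-left quadrant
    for dr in (0, half):                        # recurse into all four subsquares
        for dc in (0, half):
            tile(img, r0 + dr, c0 + dc, half, depth - 1)

canvas = np.full((256, 256), 255, dtype=np.int32)   # start from a white square
tile(canvas, 0, 0, 256, depth=5)
print(canvas.min(), canvas.max())                    # the pattern spans many gray levels
```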

Of course, the geometry can be arbitrary. For instance, the fractal image could be grown radially out of a center point.

The reader will recognize this image as the theme image used in the beginning page of each chapter in this book, selected because of its artistic simplicity and abstract analogy to a human eye. Fractal images of this type are useful sometimes as random textures. A more structured approach to image generation by computer lies in 3-D modeling. This is an area that provides an important intersection between image processing and computer graphics and is the basis for many 3-D visualization systems (e.g., flight simulators).

Since the original object is created in 3-D, images can be generated in any perspective from plane projections of the 3-D volume. Images of this type can be used for medical training and for a host of other applications, such as criminal forensics and special effects.

This organization is summarized in Fig. The diagram does not imply that every process is applied to an image. Rather, the intention is to convey an idea of all the methodologies that can be applied to images for different purposes and possibly with different objectives.

The discussion in this section may be viewed as a brief overview of the material in the remainder of the book. Image acquisition is the first process shown in Fig. This topic is considered in much more detail in Chapter 2, where we also introduce a number of basic digital image concepts that are used throughout the book.

Note that acquisition could be as simple as being given an image that is already in digital form. Generally, the image acquisition stage involves preprocessing, such as scaling.

Image enhancement is among the simplest and most appealing areas of digital image processing. Basically, the idea behind enhancement techniques is to bring out detail that is obscured, or simply to highlight certain features of interest in an image.
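As one hedged illustration of "bringing out obscured detail," the sketch below applies a plain linear contrast stretch to a synthetic low-contrast image; this is only one of many possible enhancement techniques, which the book develops properly in Chapters 3 and 4.

```python
import numpy as np

def contrast_stretch(img: np.ndarray) -> np.ndarray:
    """Linearly map the image's own min..max range onto the full 0..255 range."""
    lo, hi = float(img.min()), float(img.max())
    if hi == lo:                         # flat image: nothing to stretch
        return img.copy()
    out = (img.astype(np.float64) - lo) * 255.0 / (hi - lo)
    return out.astype(np.uint8)

# Synthetic low-contrast image: values crowded into the 100..140 range.
dull = np.random.randint(100, 141, size=(64, 64), dtype=np.uint8)
bright = contrast_stretch(dull)
print(dull.min(), dull.max(), "->", bright.min(), bright.max())   # 100 140 -> 0 255
```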

Two chapters are devoted to enhancement, not because it is more important than the other topics covered in the book but because we use enhancement as an avenue to introduce the reader to techniques that are used in other chapters as well.

Thus, rather than having a chapter dedicated to mathematical preliminaries, we introduce a number of needed mathematical concepts by showing how they apply to enhancement. This approach allows the reader to gain familiarity with these concepts in the context of image processing.

A good example of this is the Fourier transform, which is introduced in Chapter 4 but is used also in several of the other chapters. Image restoration is an area that also deals with improving the appearance of an image. However, unlike enhancement, which is subjective, image restoration is objective, in the sense that restoration techniques tend to be based on mathematical or probabilistic models of image degradation.

Color image processing is an area that has been gaining in importance because of the significant increase in the use of digital images over the Internet. Chapter 6 covers a number of fundamental concepts in color models and basic color processing in a digital domain. Color is used also in later chapters as the basis for extracting features of interest in an image.

Wavelets are the foundation for representing images in various degrees of resolution. In particular, this material is used in this book for image data compression and for pyramidal representation, in which images are subdivided successively into smaller regions. Compression, as the name implies, deals with techniques for reducing the storage required to save an image, or the bandwidth required to transmit it. Although storage technology has improved significantly over the past decade, the same cannot be said for transmission capacity.

This is true particularly in uses of the Internet, which are characterized by significant pictorial content. Image compression is familiar (perhaps inadvertently) to most users of computers in the form of image file extensions, such as the jpg file extension used in the JPEG (Joint Photographic Experts Group) image compression standard. Morphological processing deals with tools for extracting image components that are useful in the representation and description of shape. The material in this chapter begins a transition from processes that output images to processes that output image attributes, as indicated in Section 1.1.
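A small sketch of one elementary morphological tool, binary erosion with a 3x3 structuring element, written in plain NumPy for illustration; the book's own treatment of morphology is much more general than this single operation.

```python
import numpy as np

def erode3x3(binary: np.ndarray) -> np.ndarray:
    """Binary erosion with a 3x3 square structuring element: a pixel survives
    only if all nine pixels in its neighborhood are set. Border pixels are
    treated as background via zero padding."""
    padded = np.pad(binary.astype(bool), 1, constant_values=False)
    out = np.ones_like(binary, dtype=bool)
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            out &= padded[1 + dr: 1 + dr + binary.shape[0],
                          1 + dc: 1 + dc + binary.shape[1]]
    return out

square = np.zeros((9, 9), dtype=bool)
square[2:7, 2:7] = True                 # a 5x5 foreground blob
print(erode3x3(square).sum())           # eroded to a 3x3 core: prints 9
```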

Segmentation procedures partition an image into its constituent parts or objects. In general, autonomous segmentation is one of the most difficult tasks in digital image processing. A rugged segmentation procedure brings the process a long way toward successful solution of imaging problems that require objects to be identified individually.

On the other hand, weak or erratic segmentation algorithms almost always guarantee eventual failure. In general, the more accurate the segmentation, the more likely recognition is to succeed. Representation and description almost always follow the output of a segmentation stage, which usually is raw pixel data, constituting either the boundary of a region (i.e., the set of pixels separating one image region from another) or all the points in the region itself.

In either case, converting the data to a form suitable for computer processing is necessary. The first decision that must be made is whether the data should be represented as a boundary or as a complete region. Boundary representation is appropriate when the focus is on external shape characteristics, such as corners and inflections. Regional representation is appropriate when the focus is on internal properties, such as texture or skeletal shape.

In some applications, these representations complement each other. Choosing a representation is only part of the solution for transforming raw data into a form suitable for subsequent computer processing.

A method must also be specified for describing the data so that features of interest are highlighted. Description, also called feature selection, deals with extracting attributes that result in some quantitative information of interest or are basic for differentiating one class of objects from another.

Recognition is the process that assigns a label (e.g., "vehicle") to an object based on its descriptors. So far we have said nothing about the need for prior knowledge or about the interaction between the knowledge base and the processing modules in Fig. Knowledge about a problem domain is coded into an image processing system in the form of a knowledge database. This knowledge may be as simple as detailing regions of an image where the information of interest is known to be located, thus limiting the search that has to be conducted in seeking that information.

In addition to guiding the operation of each processing module, the knowledge base also controls the interaction between modules.

This distinction is made in Fig. Although we do not discuss image display explicitly at this point, it is important to keep in mind that viewing the results of image processing can take place at the output of any stage in Fig. We also note that not all image processing applications require the complexity of interactions implied by Fig.

In fact, not even all those modules are needed in some cases. For example, image enhancement for human visual interpretation seldom requires use of any of the other stages in Fig. In general, however, as the complexity of an image processing task increases, so does the number of processes required to solve the problem. Late in the 1980s and early in the 1990s, the market shifted to image processing hardware in the form of single boards designed to be compatible with industry standard buses and to fit into engineering workstation cabinets and personal computers.

In addition to lowering costs, this market shift also served as a catalyst for a significant number of new companies whose specialty is the development of software written specifically for image processing.

Although large-scale image processing systems still are being sold for massive imaging applications, such as processing of satellite images, the trend continues toward miniaturizing and blending of general-purpose small computers with specialized image processing hardware.

The function of each component is discussed in the following paragraphs, starting with image sensing. With reference to sensing, two elements are required to acquire digital images. The first is a physical device that is sensitive to the energy radiated by the object we wish to image.

The second, called a digitizer, is a device for converting the output of the physical sensing device into digital form. For instance, in a digital video camera, the sensors produce an electrical output proportional to light intensity. The digitizer converts these outputs to digital data. These topics are covered in some detail in Chapter 2. Specialized image processing hardware usually consists of the digitizer just mentioned, plus hardware that performs other primitive operations, such as an arithmetic logic unit (ALU), which performs arithmetic and logical operations in parallel on entire images.

One example of how an ALU is used is in averaging images as quickly as they are digitized, for the purpose of noise reduction. (The components of a general-purpose image processing system referred to here are image sensors, specialized image processing hardware, a computer, image processing software, mass storage, image displays, and hardcopy devices, all applied to a problem domain.) This type of hardware sometimes is called a front-end subsystem, and its most distinguishing characteristic is speed. In other words, this unit performs functions that require fast data throughputs (e.g., digitizing and averaging video images at 30 frames per second) that the typical main computer cannot handle. The computer in an image processing system is a general-purpose computer and can range from a PC to a supercomputer.

In dedicated applications, sometimes specially designed computers are used to achieve a required level of performance, but our interest here is on general-purpose image processing systems. In these systems, almost any well-equipped PC-type machine is suitable for off-line image processing tasks.

Software for image processing consists of specialized modules that perform specific tasks. A well-designed package also includes the capability for the user to write code that, as a minimum, utilizes the specialized modules.

More sophisticated software packages allow the integration of those modules and general-purpose software commands from at least one computer language. Mass storage capability is a must in image processing applications. When dealing with thousands, or even millions, of images, providing adequate storage in an image processing system can be a challenge.

Storage is measured in bytes (eight bits), Kbytes (one thousand bytes), Mbytes (one million bytes), Gbytes (meaning giga, or one billion, bytes), and Tbytes (meaning tera, or one trillion, bytes). One method of providing short-term storage is computer memory. Another is by specialized boards, called frame buffers, that store one or more images and can be accessed rapidly, usually at video rates (e.g., at 30 complete images per second).

The latter method allows virtually instantaneous image zoom, as well as scroll (vertical shifts) and pan (horizontal shifts). Frame buffers usually are housed in the specialized image processing hardware unit shown in Fig. On-line storage generally takes the form of magnetic disks or optical-media storage.
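A quick back-of-the-envelope calculation ties these storage units and video rates together; the 1024 x 1024, 8-bit, 30 frames-per-second figures below are illustrative assumptions rather than numbers taken from the text.

```python
# Back-of-the-envelope storage and bandwidth for an uncompressed image stream.
rows, cols = 1024, 1024
bits_per_pixel = 8
frames_per_second = 30

bytes_per_image = rows * cols * bits_per_pixel // 8          # 1,048,576 bytes, about 1 Mbyte
bytes_per_second = bytes_per_image * frames_per_second       # what a frame buffer must sustain
images_per_gigabyte = (10**9) // bytes_per_image

print(f"{bytes_per_image:,} bytes per image")
print(f"{bytes_per_second / 1e6:.0f} Mbytes/s at video rates")
print(f"{images_per_gigabyte} such images per Gbyte of storage")
```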

The key factor characterizing on-line storage is frequent access to the stored data. Finally, archival storage is characterized by massive storage requirements but infrequent need for access. Image displays in use today are mainly color (preferably flat screen) TV monitors. Monitors are driven by the outputs of image and graphics display cards that are an integral part of the computer system. Seldom are there requirements for image display applications that cannot be met by display cards available commercially as part of the computer system.

In some cases, it is necessary to have stereo displays, and these are implemented in the form of headgear containing two small displays embedded in goggles worn by the user. Hardcopy devices for recording images include laser printers, film cameras, heat-sensitive devices, inkjet units, and digital units, such as optical and CD-ROM disks. Film provides the highest possible resolution, but paper is the obvious medium of choice for written material.

For presentations, images are displayed on film transparencies or in a digital medium if image projection equipment is used. The latter approach is gaining acceptance as the standard for image presentations. Networking is almost a default function in any computer system in use today. Because of the large amount of data inherent in image processing applications, the key consideration in image transmission is bandwidth.

In dedicated networks, this typically is not a problem, but communications with remote sites via the Internet are not always as efficient. Fortunately, this situation is improving quickly as a result of optical fiber and other broadband technologies.

Summary

The main purpose of the material presented in this chapter is to provide a sense of perspective about the origins of digital image processing and, more important, about current and future areas of application of this technology. Although the coverage of these topics in this chapter was necessarily incomplete due to space limitations, it should have left the reader with a clear impression of the breadth and practical scope of digital image processing.

Upon concluding the study of the final chapter, the reader of this book will have arrived at a level of understanding that is the foundation for most of the work currently underway in this field.

References and Further Reading

References at the end of later chapters address specific topics discussed in those chapters, and are keyed to the Bibliography at the end of the book.

However, in this chapter we follow a different format in order to summarize in one place a body of journals that publish material on image processing and related topics.

We also provide a list of books from which the reader can readily develop a historical and current perspective of activities in this field. Thus, the reference material cited in this chapter is intended as a general-purpose, easily accessible guide to the published literature on image processing.

Major refereed journals that publish articles on image processing and related topics include the IEEE Transactions on Image Processing, the IEEE Transactions on Pattern Analysis and Machine Intelligence, Computer Vision and Image Understanding, and Pattern Recognition, among others. The following books, listed in reverse chronological order (with the number of books being biased toward more recent publications), contain material that complements our treatment of digital image processing.

These books represent an easily accessible overview of the area for the past 30 years and were selected to provide a variety of treatments. They range from textbooks, which cover foundation material; to handbooks, which give an overview of techniques; and finally to edited books, which contain material representative of current research in the field. Duda, R. Pattern Classification, 2nd ed. Ritter, G. Shapiro, L. Dougherty, E. Etienne, E. Goutsias, J, Vincent, L. Mallot, A. Marchand-Maillet, S.

Binary Digital Image Processing: Edelman, S. Lillesand, T.

Mather, P. Computer Processing of Remotely Sensed Images: Petrou, M. Image Processing: Russ, J. The Image Processing Handbook, 3rd ed. Smirnov, A. Sonka, M. Umbaugh, S. Computer Vision and Image Processing: Haskell, B. Digital Pictures: Jahne, B. Digital Image Processing: Castleman, K. Digital Image Processing, 2nd ed. Geladi, P. Bracewell, R. Sid-Ahmed, M. Jain, R. Mitiche, A. Baxes, G. Gonzalez, R. Haralick, R. Computer and Robot Vision, vols. Pratt, W. Lim, J. Schalkoff, R. Giardina, C. Serra, J.

Ballard, D. Fu, K. Nevatia, R. Pavlidis, T. Rosenfeld, R. Digital Picture Processing, 2nd ed. Hall, E. Syntactic Pattern Recognition: Andrews, H. Tou, J.

Preview

The purpose of this chapter is to introduce several concepts related to digital images and some of the notation used throughout the book. Additional topics discussed include digital image representation, the effects of varying the number of samples and gray levels in an image, some important phenomena associated with sampling, and techniques for image zooming and shrinking.
