Wednesday, May 21, 2014

Fundamentals of Digital Image and Video Processing - Week 8 Solutions

Hi Coursera people,

I never knew this week would be so interesting! I could understand almost every bit of information in the lectures! :)

Here is the question number 7 of week 8.

In this problem, you will write a MATLAB program to compute the entropy of a given gray-scale image. Follow the instructions below to finish this problem.

(1) Download the input image from here. The input is a gray-scale image with pixel values in the range [0,255]. Treat the pixel intensities in this image as symbols emitted from a DMS.

(2) Build a probability model (i.e., an alphabet with associated probabilities) corresponding to this input image. Specifically, this alphabet consists of the symbols {0,1,2,⋯,255}. To find the probability associated with each symbol, scan over all the pixels in the image and, for each pixel, adjust the probability associated with that pixel's intensity value accordingly; in other words, find the histogram of the image. Make sure you normalize the probability model correctly so that each probability is a real-valued number in [0,1].

(3) Compute the entropy using the formula that you have learned in class. Enter the result below to at least 2 decimal points.

Here is my attempt at the code.

A = imread('C:\~Coursera courses\Image and Video processing\Week 8\Cameraman256.bmp');
for i = 1:256
    DMS(i,1) = i-1;   % symbol values 0..255
    DMS(i,2) = 0;     % occurrence counts
end
for i = 1:256
    for j = 1:256
        for k = 1:256
            if A(i,j) == DMS(k,1)
                DMS(k,2) = DMS(k,2) + 1;
            end
        end
    end
end
total = 0;            % 'total' rather than 'sum', which would shadow the built-in
for i = 1:256
    total = total + DMS(i,2);
end
for i = 1:256
    prob(i) = DMS(i,2)/total;
end
H = 0;
for i = 1:256
    if prob(i) > 0    % skip empty bins; log2(0) would give -Inf
        H = H - prob(i) * log2(prob(i));
    end
end
H
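For readers without MATLAB, here is a rough equivalent sketched in Python (pure standard library; the tiny demo "image" is a stand-in for the actual Cameraman256.bmp, which I cannot reproduce here). It computes the normalized histogram and then the entropy H = -Σ p(i)·log2 p(i):

```python
from collections import Counter
from math import log2

def image_entropy(pixels):
    """Entropy in bits of a DMS whose symbols are the pixel intensities."""
    counts = Counter(pixels)                      # histogram of intensity values
    total = len(pixels)
    probs = [c / total for c in counts.values()]  # normalized histogram
    return -sum(p * log2(p) for p in probs)       # empty bins never appear here

# Tiny stand-in "image": a flat list of intensities in [0, 255]
demo = [0, 0, 1, 1, 2, 2, 3, 3]
print(image_entropy(demo))  # 4 equally likely symbols -> 2.0 bits
```

Because `Counter` only stores symbols that actually occur, the zero-probability bins that trip up the MATLAB version never enter the sum.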

-Cheers,
Vijay.

21 comments:

  1. Can you give the detailed solutions for HW7 and HW8? Many thanks, as I have problems with MATLAB.

    ReplyDelete
  2. Vijay, can you give the answer to Question 4, Week 8:
    Given a discrete memoryless source (DMS) with alphabet S={a,b,c} and associated probabilities p(a)=0.2, p(b)=0.5 and p(c)=0.3, what is the entropy of this source in unit of bits? Enter the answer to 3 decimal points.

    ReplyDelete
  3. Hi Tham Kim,

    Find the explanation as follows...

    A = imread('C:\~Coursera courses\Image and Video processing\Week 8\Cameraman256.bmp'); % read the image
    for i = 1:256 % loop 256 times, as the image has 256 intensity levels
        DMS(i,1) = i-1; % initialize the first column with values from 0 to 255
        DMS(i,2) = 0;   % initialize the second column with 0
    end
    for i = 1:256 % loop over the 256 rows of the image
        for j = 1:256 % loop over the 256 columns of the image
            for k = 1:256 % loop over the 256 symbols for the test below
                if A(i,j) == DMS(k,1) % does this pixel's intensity match symbol k?
                    DMS(k,2) = DMS(k,2) + 1; % increment that symbol's count
                end
            end
        end
    end
    total = 0;
    for i = 1:256 % total of all the counts in the second column of DMS
        total = total + DMS(i,2);
    end
    for i = 1:256 % normalize each count to get a probability
        prob(i) = DMS(i,2)/total;
    end
    H = 0; % variable to accumulate the answer
    for i = 1:256 % accumulate the entropy
        if prob(i) > 0 % skip empty bins; log2(0) would give -Inf
            H = H - prob(i) * log2(prob(i));
        end
    end
    H % print the answer



    If you have any other doubts, feel free to ping me again.

    ReplyDelete
  4. Hi Sanchit Goel,

    You can apply the same formula as used in the class to calculate entropy.

    entropy = - prob(x) * log(prob(x)), summed over all symbols x
    i.e. answer = -0.2 * log(0.2) - 0.3 * log(0.3) - 0.5 * log(0.5)

    Tell me if you still don't get the correct answer.

    ReplyDelete
    Replies
    1. I got 0.477, but it seems to be incorrect after I input the value in the quiz.

      Delete
  5. Can you give the solution for this week 8 question no. 4?
    Given a discrete memoryless source (DMS) with alphabet S={a,b,c} and associated probabilities p(a)=0.2, p(b)=0.5 and p(c)=0.3, what is the entropy of this source in unit of bits? Enter the answer to 3 decimal points.

    ReplyDelete
  6. Hi Vijay Karthick,

    Thank you for your explanation. I have tried and faced the same issue as Goel. :(

    ReplyDelete
  7. Hi Vijay,
    Can you give me the answer for this week 8 question no. 4?

    I tried, but my answer is incorrect, so please send me the answer.

    ReplyDelete
  8. Hi Vijay,

    Can you give me the answer for this week 8 question no. 4?

    ReplyDelete
  9. At least help yourself a little bit... Kindly put in some more effort... Question 4 is a very easy question if you just do a little work yourself and use your bloody hands to GOOGLE the word ENTROPY. Kindly read a little to understand the meaning behind the question.
    Btw, for question 4, if you make an extra effort by just changing the base of the logarithm, then you shall find what you are looking for. But I must add: stop being a child and being spoon-fed. Try to put in some more effort, please. Don't be pathetic.
    And I'm very thankful to Vijay Karthick for helping us so much... To be truthful with you, bro, in the third week I had to literally copy your program, but from the next week onwards I first tried it myself and then checked against your answer. Again... thanks for the help, bro.

    ReplyDelete
  10. Hi PAL,

    Thanks for the reply. I withheld from replying so that people could find out the answer on their own! Great to know that you are trying to solve the problems yourself! Kudos!

    And for the others, use log to the base 2. In MATLAB, plain log is the natural logarithm (base e), not base 2, so use log2 for entropy in bits.
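    As a quick illustration (sketched in Python here rather than MATLAB, but log2 behaves the same way in both), you can see how the choice of base changes the number for the Question 4 source:

```python
from math import log, log2

p = [0.2, 0.5, 0.3]                    # p(a), p(b), p(c) from Question 4
H_nat  = -sum(x * log(x)  for x in p)  # natural log: NOT in bits
H_bits = -sum(x * log2(x) for x in p)  # base 2: entropy in bits
```

    The two values differ by a constant factor of log(2), which is why an answer computed with the wrong base never matches the quiz.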

    ReplyDelete
  11. Thanks to Pal and Vijay....

    Now i got the answer.....

    ReplyDelete
  12. Hi Vijay,

    Can we use the built-in entropy function directly?
    I = imread('circuit.tif');
    J = entropy(I)

    ReplyDelete
  13. When I am using it, the following error occurs:

    "Subscript indices must either be real positive integers or logicals."

    ReplyDelete
  14. A short solution:
    img = imread('Cameraman256.bmp');
    % Calculate the histogram
    H = imhist(img);
    % Normalize the histogram
    H = H/sum(H);
    % Keep only nonempty bins, so log2 never sees zero
    H = H(H > 0);
    Entropy = sum(H.*log2(1./H))

    ReplyDelete
  15. This comment has been removed by the author.

    ReplyDelete
  16. An easier and a logical solution is as follows:
    cam = imread('Cameraman256.bmp');
    [l,w] = size(cam);
    count = zeros(256,1);
    for i = 1:l
        for j = 1:w
            for k = 1:256
                if cam(i,j) == k-1
                    count(k) = count(k) + 1;
                end
            end
        end
    end
    total = 0;
    for i = 1:256
        total = total + count(i,1);
    end
    for i = 1:256
        count(i) = count(i)/total;
    end
    sum1 = 0;
    for i = 1:256
        if count(i) > 0 % skip empty bins; log2(1/0) is Inf
            sum1 = sum1 + count(i)*log2(1/count(i));
        end
    end

    The answer is sum1, which equals 7.10 :)

    ReplyDelete