Solution to a vector to matrix recoding problem
A is a 100-dimensional vector containing integers N between 1 and 10 (inclusive), distributed in runs of 10:
i.e.
at index x  -> N
at index 1  -> 1
at index 11 -> 2
at index 21 -> 3
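
A minimal sketch of constructing such a vector in Matlab/Octave (the repelem construction is an assumption, not part of the original note; kron(1:10, ones(1,10)) is an equivalent on versions without repelem):

A = repelem(1:10, 10)   % ten 1s, then ten 2s, ..., then ten 10s: A(1:10) == 1, A(11:20) == 2, ...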
Problem:
-----------
Need to recode A into a 10x100 (rows x columns) matrix such that each column x is a 10-dimensional one-hot vector y with y(A(x)) = 1 and the rest 0 (a loop-based sketch follows the example below).
i.e.
For x between 1 and 10 we will have the same vector y replicated 10 times (the first 10 columns below); at x = 11 the 1 moves down to the second row, since A(11) = 2:
111...0
000...1
000...0
000...0
000...0
000...0
000...0
000...0
000...0
000...0
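
For reference, a minimal sketch of the loop-and-index version dismissed as easy below (the output name M is an assumption, chosen so the input vector A keeps its name):

M = zeros(10, 100);     % one column per element of A
for j = 1:100           % the explicit loop the expected solution must avoid
    M(A(j), j) = 1;     % put the 1 in row A(j) of column j
end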
Expected Solution
------------------
I need a vectorized implementation of the above, i.e. no for loops and no per-element index manipulation; that approach is easy.
Actual Solution in Matlab/Octave
---------------------------------
x = [1 1 1 2]                      % example values: the row that gets the 1 in each column
A = zeros(10, 4)                   % small 10x4 demo (A is reused here as the output matrix)
A(sub2ind(size(A), x, 1:4)) = 1    % sub2ind maps the (row x(j), column j) pairs to linear indices
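
The same sub2ind trick scaled to the full 10x100 problem, as a sketch (the names A and M here are assumptions, used to keep the input vector and output matrix distinct):

A = repelem(1:10, 10)                % the 100-element input vector from above
M = zeros(10, 100);
M(sub2ind(size(M), A, 1:100)) = 1    % rows A, columns 1:100, set in one vectorized step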