00:00:00,760 --> 00:00:01,300
Right.
2
00:00:01,320 --> 00:00:08,130
So we've just imported the data into Keras. However, that is not in the ideal or the correct format
3
00:00:08,130 --> 00:00:10,410
that Keras needs to train on.
4
00:00:10,440 --> 00:00:15,370
So this chapter is going to be a tiny chapter on how to get our data into the correct shape.
5
00:00:15,450 --> 00:00:17,020
So let's look at the next slide.
6
00:00:17,330 --> 00:00:21,780
So as I said before, we've brought this in here, and the shape of it is here.
7
00:00:21,780 --> 00:00:22,650
All right.
8
00:00:22,650 --> 00:00:29,340
However, Keras requires it to be in a special shape, and that shape is basically
9
00:00:29,910 --> 00:00:33,040
the number of samples, rows, columns, and depth.
10
00:00:33,120 --> 00:00:40,880
So this is how x_train looks here, but what Keras actually wants is 60000, 28, 28, and 1.
11
00:00:40,950 --> 00:00:46,090
So we sort of have to add a fourth dimension onto our data, and that's how we do it here.
12
00:00:46,090 --> 00:00:47,580
So now let's do this in our notebook.
13
00:00:47,910 --> 00:00:52,980
And the reason Keras is looking for a fourth dimension here is because when you load a greyscale image
14
00:00:52,980 --> 00:00:57,930
dataset, it doesn't add the depth dimension for us.
15
00:00:58,230 --> 00:01:00,550
Mainly because it's greyscale and has no colour depth.
16
00:01:00,630 --> 00:01:04,850
If it was a colour image dataset, you would have a three popping up at the end here.
17
00:01:05,160 --> 00:01:10,390
However, we still need to add this one just to indicate to Keras that it's a greyscale image dataset.
18
00:01:10,590 --> 00:01:17,070
Otherwise it's going to return an error, as it's an incorrect format. So to change this, you would just use
19
00:01:17,070 --> 00:01:20,210
NumPy's reshape function, which is quite simple to use.
20
00:01:20,220 --> 00:01:23,170
You just use x_train.reshape.
21
00:01:23,170 --> 00:01:24,760
Take the first value, 60,000 digits.
22
00:01:24,810 --> 00:01:27,620
That's what you get when you use shape.
23
00:01:27,780 --> 00:01:30,640
This is the first dimension that index 0 addresses.
24
00:01:30,750 --> 00:01:37,310
Then you enter 28, 28, or image rows and image columns, and add 1. It's quite simple to use.
25
00:01:37,320 --> 00:01:38,880
So let's get into it.
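The reshape step described above can be sketched as follows. This is a minimal sketch, assuming the variable names used in the video (x_train, 28x28 images); a zero array stands in for the MNIST images so it runs without downloading the dataset.

```python
import numpy as np

# Stand-in for the MNIST training images: 60,000 greyscale 28x28 images
# (zeros used here so the sketch runs without downloading the dataset)
x_train = np.zeros((60000, 28, 28), dtype=np.uint8)

# Keras expects (samples, rows, cols, depth); add a depth of 1
# to mark the single greyscale channel
x_train = x_train.reshape(x_train.shape[0], 28, 28, 1)

print(x_train.shape)  # (60000, 28, 28, 1)
```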
26
00:01:39,190 --> 00:01:42,350
Let's type that out now.
27
00:01:42,740 --> 00:01:43,120
OK.
28
00:01:43,160 --> 00:01:49,220
So like we just saw, this is a section of code where we reshape and add the fourth dimension onto the
29
00:01:49,790 --> 00:01:50,610
training datasets
30
00:01:50,620 --> 00:01:52,580
and test sets.
31
00:01:52,620 --> 00:01:57,650
I didn't point this out to you before, but we just get image rows and image columns by simply addressing
32
00:01:57,650 --> 00:02:00,450
the first dimension of the shape here of x_train.
33
00:02:00,540 --> 00:02:08,030
Now, we use x_train[0] and x_train[1]. We don't need to use x_train; we can use x_test as well because
34
00:02:08,030 --> 00:02:09,850
they're the same dimensionality.
35
00:02:09,890 --> 00:02:15,530
However, this just gives us the rows and columns, so let's actually print this out so you can actually
36
00:02:15,530 --> 00:02:16,990
see what's going on here.
37
00:02:17,480 --> 00:02:21,800
So if we run print on that, you'll see it returns 28.
38
00:02:21,800 --> 00:02:23,590
So this is getting the 28 here.
39
00:02:23,960 --> 00:02:32,210
And if we were to just use this for columns too, you'll see we'll also get 28, right?
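Extracting the row and column counts from the data, as described above, can be sketched like this. It's a sketch assuming the video's variable names (img_rows, img_cols); each element of x_train is one 28x28 image, so its shape gives us the rows and columns.

```python
import numpy as np

# Stand-in for the MNIST training images (60,000 greyscale 28x28 images)
x_train = np.zeros((60000, 28, 28), dtype=np.uint8)

# x_train[0] is a single 28x28 image; its shape gives rows and columns
img_rows = x_train[0].shape[0]
img_cols = x_train[0].shape[1]

print(img_rows, img_cols)  # 28 28
```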
40
00:02:32,210 --> 00:02:38,130
So moving on, this is fairly simple, and it keeps things nice and neat.
41
00:02:38,150 --> 00:02:41,010
Again, this is how we make the input shape.
42
00:02:41,010 --> 00:02:48,340
Now, you've seen before that we need the input shape dimension in the first layer of the convolutional neural network.
43
00:02:48,450 --> 00:02:50,870
And this is how we create our image shape here.
44
00:02:51,060 --> 00:02:52,540
It's a tuple that combines
45
00:02:52,550 --> 00:02:57,560
the rows, columns, and the dimensionality of the depth of the image.
46
00:02:57,710 --> 00:02:58,540
This would be three
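Building that input_shape tuple can be sketched as below, assuming the names used in the video (img_rows, img_cols, input_shape); the depth is 1 for greyscale, and would be 3 for a colour dataset.

```python
# Row and column counts taken from the dataset (28x28 for MNIST)
img_rows, img_cols = 28, 28

# Shape tuple passed to the first convolutional layer:
# (rows, cols, depth) - depth 1 for greyscale, 3 for colour
input_shape = (img_rows, img_cols, 1)

print(input_shape)  # (28, 28, 1)
```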
47
00:02:58,590 --> 00:03:04,400
again, if it was a colour image dataset. And Keras expects the data to be in float32.
48
00:03:04,730 --> 00:03:08,090
Right now I believe it's going to be in some sort of integer format here.
49
00:03:08,490 --> 00:03:10,360
So we just need to change this.
50
00:03:10,830 --> 00:03:15,410
So we just change x_train with astype('float32'), and do it for x_test as well.
51
00:03:15,570 --> 00:03:18,280
We don't need to do this for the labels, just the training data.
52
00:03:18,780 --> 00:03:25,290
And this is how we normalize the data. We normalize all the data by dividing it by 255 because, remember, image values
53
00:03:25,380 --> 00:03:27,270
range from 0 to 255.
54
00:03:27,530 --> 00:03:34,720
So if we divide all the image data by 255, we basically bring it into the range 0 to 1.
55
00:03:34,830 --> 00:03:36,470
So that's quite simple here now.
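The cast and normalisation steps just described can be sketched as follows. This is a minimal sketch assuming the video's variable name x_train; a tiny array of sample pixel values stands in for the image data.

```python
import numpy as np

# Stand-in pixel data in integer format, like raw MNIST (values 0-255)
x_train = np.array([[0, 128, 255]], dtype=np.uint8)

# Cast to float32, the type Keras expects
x_train = x_train.astype('float32')

# Divide by 255 to bring all values into the range 0 to 1
x_train /= 255.0

print(x_train.min(), x_train.max())  # 0.0 1.0
```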
56
00:03:36,720 --> 00:03:40,350
And basically we just print this out again. We don't really need to do that because we did it before,
57
00:03:40,630 --> 00:03:43,680
but it's just to do a sanity check on our data.
58
00:03:43,770 --> 00:03:45,480
So let's run this again.
59
00:03:45,480 --> 00:03:49,770
If you want to see what x_train actually looks like now, this is it.
60
00:03:49,860 --> 00:03:53,930
You can see the decimal points, though some of it is hidden here.
61
00:03:54,720 --> 00:03:59,870
But all the data ranges from 0 to 1 right now.
62
00:04:00,430 --> 00:04:00,670
OK.
63
00:04:00,690 --> 00:04:03,440
So let's move on to one-hot encoding the labels.
|