TensorFlow Image Processing
The main topics are:
- Loading images
- Image formats
- Converting images to TFRecord format
- Reading TFRecord files
- Image processing
- Reading data with the Dataset API
I. Loading Images
TensorFlow loads image files the same way it loads binary files; the only difference is that the image contents must be decoded. Two common approaches are introduced here. The first treats the picture as a single file and reads it in directly to obtain the raw data, which is then decoded: for example, read the image file with tf.gfile.FastGFile() and then decode it with tf.image.decode_jpeg() or tf.image.decode_png(). The code is as follows:
import matplotlib.pyplot as plt
import tensorflow as tf
%matplotlib inline
image_raw_data_jpg=tf.gfile.FastGFile("./data/image/cat/cat.jpg","rb").read()
with tf.Session() as session:
    img_data=tf.image.decode_jpeg(image_raw_data_jpg)
    # display the image
    plt.figure(1)
    # print the image matrix
    print(session.run(img_data))
    plt.imshow(img_data.eval())
[[[255 255 255]
  [255 255 255]
  [255 255 255]
  ...
  [255 255 255]
  [255 255 255]
  [255 255 255]]
 ...
 [[255 255 255]
  [255 255 255]
  [255 255 255]
  ...
  [255 255 255]
  [255 255 255]
  [255 255 255]]]
The first method is not well suited to reading data in batches. Batch reading can be done with a second method, which treats each image as a file and reads it through a queue. In TensorFlow a queue is not only a data structure; it also provides a multi-threading mechanism. First, tf.train.string_input_producer locates the required files and loads them into a queue; then tf.WholeFileReader() loads a complete file into memory, and its read method fetches an image file from the queue; finally, the image is decoded with tf.image.decode_jpeg() or tf.image.decode_png(). The code is as follows:
import tensorflow as tf
path="./data/image/cat/cat.jpg"
# create the input queue
file_queue=tf.train.string_input_producer([path])
image_reader=tf.WholeFileReader()
_,image=image_reader.read(file_queue)
image=tf.image.decode_jpeg(image)
with tf.Session() as session:
    # coordinator for the worker threads
    coord=tf.train.Coordinator()
    # start the threads that run the queue
    threads=tf.train.start_queue_runners(sess=session,coord=coord)
    print(session.run(image))
    # stop all threads
    coord.request_stop()
    coord.join(threads)
(output: the same pixel matrix as in the previous example)
II. Image Formats
The important information in an image is stored in some appropriate file format, and different formats are suited to different problems. TensorFlow supports several image formats; the commonly used ones are JPEG, PNG, and the TFRecord format.
1. JPEG and PNG
TensorFlow supports the JPEG and PNG image formats. These two serve as the representatives here because converting other formats to them is very easy. JPEG images are decoded with tf.image.decode_jpeg(), and PNG images are decoded with tf.image.decode_png().
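To illustrate how straightforward such a conversion is, here is a minimal sketch (the PNG path ./data/image/cat/cat.png is a hypothetical example file) that decodes a PNG and re-encodes it as a JPEG:
# sketch only: assumes a PNG file exists at the path below
png_raw=tf.gfile.FastGFile("./data/image/cat/cat.png","rb").read()
png_img=tf.image.decode_png(png_raw,channels=3)   # uint8 tensor of shape [H, W, 3]
jpg_str=tf.image.encode_jpeg(png_img)             # re-encode the pixels as JPEG bytes
with tf.Session() as session:
    with tf.gfile.GFile("./data/image/cat/cat_converted.jpg","wb") as f:
        f.write(session.run(jpg_str))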
2. TFRecord
For large amounts of image data, TensorFlow uses the TFRecord format: the image data and the image labels are stored, uncompressed, in binary form in a TFRecord file, so they can be loaded into memory quickly and are easy to move, copy, and process. The data in a TFRecord file is stored using the tf.train.Example Protocol Buffer format:
message Example {
    Features features = 1;
};
message Features {
    map<string, Feature> feature = 1;
};
message Feature {
    oneof kind {
        BytesList bytes_list = 1;
        FloatList float_list = 2;
        Int64List int64_list = 3;
    }
};
tf.train.Example contains a dictionary that maps attribute names to values. An attribute name is a string, and its value can be a byte string (BytesList), a list of floats (FloatList), or a list of integers (Int64List). Here the BytesList holds the image data and the Int64List holds the integer corresponding to the image's label.
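As a small, self-contained illustration of this structure (the label value and the byte string below are made-up placeholders), an Example can be built, serialized, and parsed back as follows:
# sketch: build an Example with one integer label and one byte-string feature
example=tf.train.Example(features=tf.train.Features(feature={
    "label":tf.train.Feature(int64_list=tf.train.Int64List(value=[0])),
    "img_raw":tf.train.Feature(bytes_list=tf.train.BytesList(value=[b"\x00\x01\x02"]))
}))
serialized=example.SerializeToString()          # the bytes that get written to a TFRecord file
print(tf.train.Example.FromString(serialized))  # parse it back and print the stored features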
III. Converting Images to a TFRecord File
Converting images to a TFRecord file means storing the image data and the label data as a binary file that follows the tf.train.Example structure. Suppose a number of cat images and dog images have already been placed, by label, in the directories "./data/image/cat/" and "./data/image/dog/". The following code converts these images into the TFRecord file cat_dog.tfrecords in the "./data" directory.
import os
import tensorflow as tf
from PIL import Image # image-processing package
import numpy as np
folder="./data/image/"
savefolder="./data/"
# image class labels; each label name matches the corresponding directory name
# (a list is used instead of a set so the index-to-label mapping is deterministic: 0=cat, 1=dog)
label=["cat","dog"]
# the TFRecord file to generate
writer=tf.python_io.TFRecordWriter(savefolder+"cat_dog.tfrecords")
# number of images written
count=0
for index,name in enumerate(label):
    folder_path=folder+name+"/"
    for img_name in os.listdir(folder_path):
        # full path of each image
        img_path=folder_path+img_name
        img=Image.open(img_path)
        img=img.resize((128,128))
        # convert the image to raw bytes
        img_raw=img.tobytes()
        # wrap the label and the image data in an Example object
        example=tf.train.Example(features=tf.train.Features(feature={
            "label":tf.train.Feature(int64_list=tf.train.Int64List(value=[index])),
            "img_raw":tf.train.Feature(bytes_list=tf.train.BytesList(value=[img_raw]))
        }))
        # serialize to a string and write it out
        writer.write(example.SerializeToString())
        count=count+1
writer.close()
IV. Reading the TFRecord File
To read the TFRecord file, first use the input producer (tf.train.string_input_producer) to load it into a queue, then read records from the queue (tf.TFRecordReader) and unpack each Example (tf.parse_single_example) into image data and label data; finally, decode the image data and save it as a JPEG file.
# the filename queue for the TFRecord file
filename_queue=tf.train.string_input_producer([savefolder+"cat_dog.tfrecords"])
reader=tf.TFRecordReader()
# returns the file name and a serialized example
_,serialized_example=reader.read(filename_queue)
features=tf.parse_single_example(serialized_example,
                                 features={
                                     "label":tf.FixedLenFeature([],tf.int64),
                                     "img_raw":tf.FixedLenFeature([],tf.string)
                                 })
# recover the image and label tensors from the parsed features
image=tf.decode_raw(features["img_raw"],tf.uint8)
image=tf.reshape(image,[128,128,3])
label=tf.cast(features["label"],tf.int32)
with tf.Session() as session:
    init_op=tf.global_variables_initializer()
    session.run(init_op)
    # thread coordinator
    coord=tf.train.Coordinator()
    # start the queue-runner threads
    threads=tf.train.start_queue_runners(coord=coord)
    for i in range(count):
        # fetch one image and its label from the session
        example,l=session.run([image,label])
        img=Image.fromarray(example,"RGB")
        img.save(folder+str(i)+"_label_"+str(l)+".jpg")
        print(example,l)
    coord.request_stop()
    # wait for all threads to finish
    coord.join(threads)
[[[142 143 137]
  [145 146 140]
  [148 149 141]
  ...
  [208 220 234]
  [209 221 233]
  [209 223 234]]] 0
(... one 128x128x3 pixel matrix is printed per image, each followed by its label: 0 for the cat images and 1 for the dog images ...)
from IPython.display import Image
Image(filename="./data/image/cat/cat.jpg",width=600)
V. Image Processing
This section uses the tf.image API, which contains many image-processing functions. The code operates on a 600*600 image of a cat.
1. Resizing
image_raw_data=tf.gfile.FastGFile("./data/image/cat/cat.jpg","rb").read()
img_data=tf.image.decode_jpeg(image_raw_data)
with tf.Session() as session:
    resized=tf.image.resize_images(img_data,[300,300],method=0)
    cat=np.asarray(resized.eval(),dtype="uint8")
    plt.imshow(cat)
    plt.show()
2. Cropping and padding
with tf.Session() as session:
    croped=tf.image.resize_image_with_crop_or_pad(img_data,300,300)
    padded=tf.image.resize_image_with_crop_or_pad(img_data,3000,3000)
    plt.imshow(croped.eval())
    plt.show()
    plt.imshow(padded.eval())
    plt.show()
3. Transposing the image (diagonal flip)
with tf.Session() as session:
    transposed=tf.image.transpose_image(img_data)
    plt.imshow(transposed.eval())
    plt.show()
4. Color adjustment
with tf.Session() as session:
    adjusted=tf.image.random_brightness(img_data,max_delta=0.5)
    plt.imshow(adjusted.eval())
    plt.show()
5. Hue and saturation
with tf.Session() as session:
    adjusted=tf.image.adjust_hue(img_data,0.1)
    plt.imshow(adjusted.eval())
    plt.show()
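Since the heading also mentions saturation, here is a minimal sketch using tf.image.adjust_saturation on the same image; the factor 2.0 is an arbitrary choice:
with tf.Session() as session:
    # increase the saturation of the same cat image by a factor of 2.0 (arbitrary value)
    adjusted=tf.image.adjust_saturation(img_data,2.0)
    plt.imshow(adjusted.eval())
    plt.show()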
VI. Reading Data with the Dataset API
1. Building a Dataset
tf.data.Dataset.from_tensor_slices()
Use tf.data.Dataset.from_tensor_slices() to build a dataset from one or more tf.Tensor objects; these tf.Tensor objects can hold arrays, matrices, dictionaries, tuples, and so on. A concrete example:
import tensorflow as tf
import numpy as np
arry1=np.array([1.0,2.0,3.0,4.0,5.0])
dataset=tf.data.Dataset.from_tensor_slices(arry1)
iterator=dataset.make_one_shot_iterator()
one_element=iterator.get_next()
with tf.Session() as session:
    for i in range(len(arry1)):
        print(session.run(one_element))
1.0
2.0
3.0
4.0
5.0
2. Dataset transformations
Datasets support elements of any structure. When Dataset.map(), Dataset.flat_map(), and Dataset.filter() are used to transform a dataset, they apply a function to every element, and the structure of the elements determines the arguments of that function (a filter sketch follows the map example below).
import tensorflow as tf
import numpy as np
a1=np.array([1.0,2.0,3.0,4.0,5.0])
dataset=tf.data.Dataset.from_tensor_slices(a1)
dataset=dataset.map(lambda x:x**2)
iterator=dataset.make_one_shot_iterator()
one_element=iterator.get_next()
with tf.Session() as session:
    for i in range(len(a1)):
        print(session.run(one_element))
1.0
4.0
9.0
16.0
25.0
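Dataset.filter() works the same way. A small sketch that keeps only the elements of a1 greater than 2.0 (the threshold is arbitrary):
dataset=tf.data.Dataset.from_tensor_slices(a1)
# the predicate must return a scalar boolean tensor; elements for which it is False are dropped
dataset=dataset.filter(lambda x:x>2.0)
iterator=dataset.make_one_shot_iterator()
one_element=iterator.get_next()
with tf.Session() as session:
    try:
        while True:
            print(session.run(one_element))
    except tf.errors.OutOfRangeError:
        pass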
dataset = tf.data.Dataset.range(100)
iterator = dataset.make_one_shot_iterator()
next_element = iterator.get_next()
sess=tf.Session()
for i in range(100):
    value = sess.run(next_element)
    assert i == value
# An initializable iterator must be initialized with iterator.initializer before it is used. Although less convenient, it supports parameterization: one or more tf.placeholder() tensors can be fed when the iterator is initialized:
max_value = tf.placeholder(tf.int64, shape=[])
dataset = tf.data.Dataset.range(max_value)
iterator = dataset.make_initializable_iterator()
next_element = iterator.get_next()
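For completeness, a minimal sketch of initializing and consuming this parameterized iterator, reusing the sess opened above; the value 10 fed for max_value is an arbitrary choice:
# feed the placeholder when the iterator is initialized; 10 is an arbitrary example value
sess.run(iterator.initializer, feed_dict={max_value: 10})
for i in range(10):
    assert i == sess.run(next_element)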
dataset = tf.data.Dataset.range(5)
iterator = dataset.make_initializable_iterator()
next_element = iterator.get_next()
result=tf.add(next_element,next_element)
sess.run(iterator.initializer)
print(sess.run(result))
print(sess.run(result))
print(sess.run(result))
print(sess.run(result))
print(sess.run(result))
try:
    sess.run(result)
except tf.errors.OutOfRangeError:
    print("End of dataset")
0
2
4
6
8
End of dataset
import cv2
def _read_py_function