Garbage Classification with a CNN in TensorFlow

Shanghai has begun enforcing household garbage sorting. Can the machine learning and deep learning methods we usually study be used to build a simple garbage-classification model?
Below we use a CNN to classify garbage. The dataset used here has six categories (different from Shanghai's standard): glass, paper, cardboard, plastic, metal, and general trash.
It contains 2527 images of household garbage, which the dataset's creators divided into six classes:

glass: 501 images
paper: 594 images
cardboard: 403 images
plastic: 482 images
metal: 410 images
trash (general waste): 137 images
The items were photographed on a whiteboard under daylight or indoor lighting, and the images were resized to 512 × 384.

Dataset: https://github.com/garythung/trashnet/tree/master/data
Unzip data/dataset-resized.zip before running the code.
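
As a quick sanity check, the per-class counts above can be reproduced from the unzipped folder layout (a minimal sketch, assuming one subfolder per class under dataset-resized, the same layout the code below relies on):

import glob, os

base_path = '/Users/zhangwenna/Desktop/dataset-resized'  # adjust to your local path
for cls in sorted(os.listdir(base_path)):
    cls_dir = os.path.join(base_path, cls)
    if os.path.isdir(cls_dir):
        # count the .jpg files in each class folder
        print(cls, len(glob.glob(os.path.join(cls_dir, '*.jpg'))))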

import numpy as np
import matplotlib.pyplot as plt
from tensorflow.keras.preprocessing.image import ImageDataGenerator, load_img, img_to_array, array_to_img
from tensorflow.keras.layers import Conv2D, Flatten, MaxPooling2D, Dense
from tensorflow.keras.models import Sequential

import glob, os, random

base_path = '/Users/zhangwenna/Desktop/dataset-resized'  # path to the unzipped dataset
img_list = glob.glob(os.path.join(base_path, '*/*.jpg'))  # all images across all class folders
print(len(img_list))

# We have 2527 images in total; display 6 of them at random
plt.figure(figsize=(12, 8))
for i, img_path in enumerate(random.sample(img_list, 6)):
    img = load_img(img_path)
    img = img_to_array(img, dtype=np.uint8)  # keep uint8 so imshow renders the raw pixels

    plt.subplot(2, 3, i+1)
    plt.imshow(img.squeeze())
plt.show()

output1:
2527

[figure: six randomly sampled images from the dataset]

# Split the data into training and validation sets.
# ImageDataGenerator() from keras.preprocessing.image is an image generator that feeds the model
# one batch of batch_size samples at a time. It can also augment each batch on the fly
# (rotations, shifts, flips, normalization, etc.), enlarging the effective dataset and
# improving the model's generalization.

train_datagen = ImageDataGenerator(
    rescale=1./255, shear_range=0.1, zoom_range=0.1,
    width_shift_range=0.1, height_shift_range=0.1, horizontal_flip=True,
    vertical_flip=True, validation_split=0.1)
# shear_range: shear intensity (shear angle in the counter-clockwise direction)
# validation_split: fraction of images reserved for validation (strictly between 0 and 1)
test_datagen = ImageDataGenerator(
    rescale=1./255, validation_split=0.1)  # validation images are only rescaled, never augmented
    
train_generator = train_datagen.flow_from_directory(
    base_path, target_size=(300, 300), batch_size=16,
    class_mode='categorical', subset='training', seed=0)

validation_generator = test_datagen.flow_from_directory(
    base_path, target_size=(300, 300), batch_size=16,
    class_mode='categorical', subset='validation', seed=0)

labels = train_generator.class_indices            # {'cardboard': 0, 'glass': 1, ...}
labels = dict((v, k) for k, v in labels.items())  # invert to {0: 'cardboard', ...}

print(labels)

output2:
Found 2276 images belonging to 6 classes.
Found 251 images belonging to 6 classes.
{0: 'cardboard', 1: 'glass', 2: 'metal', 3: 'paper', 4: 'plastic', 5: 'trash'}
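
To sanity-check the augmentation and label mapping, we can pull one batch from train_generator and display its first six images with their class names (a small sketch; the generator yields already-rescaled float images that imshow can render directly):

batch_x, batch_y = next(train_generator)  # one batch of 16 augmented images
plt.figure(figsize=(12, 8))
for i in range(6):
    plt.subplot(2, 3, i+1)
    plt.title(labels[np.argmax(batch_y[i])])  # one-hot label back to class name
    plt.imshow(batch_x[i])
plt.show()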

model = Sequential([
    # Block 1: 300x300x3 -> 150x150x32
    Conv2D(filters=32, kernel_size=3, padding='same', activation='relu', input_shape=(300, 300, 3)),
    MaxPooling2D(pool_size=2),

    # Block 2: 150x150x32 -> 75x75x64
    Conv2D(filters=64, kernel_size=3, padding='same', activation='relu'),
    MaxPooling2D(pool_size=2),

    # Block 3: 75x75x64 -> 37x37x32
    Conv2D(filters=32, kernel_size=3, padding='same', activation='relu'),
    MaxPooling2D(pool_size=2),

    # Block 4: 37x37x32 -> 18x18x32
    Conv2D(filters=32, kernel_size=3, padding='same', activation='relu'),
    MaxPooling2D(pool_size=2),

    Flatten(),  # 18 * 18 * 32 = 10368 features

    Dense(64, activation='relu'),

    Dense(6, activation='softmax')  # one probability per class
])
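
To confirm the layer shapes noted above before training, you can print the architecture; model.summary() lists each layer's output shape and parameter count:

model.summary()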
# categorical cross-entropy loss, Adam optimizer, accuracy as the metric
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['acc'])
# fit_generator is deprecated in TF2 in favor of model.fit, which also accepts generators
model.fit_generator(train_generator, epochs=100, steps_per_epoch=2276//32,
                    validation_data=validation_generator, validation_steps=251//32)

# steps_per_epoch is the number of training images divided by the batch size: with 100 images
# and a batch size of 50, steps_per_epoch is 2. Note that batch_size here is 16, so 2276//32 (= 71)
# only covers about half of the training batches per epoch; 2276//16 would use all of them.
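
Rather than hard-coding these counts, they can be derived from the generators themselves (a minimal sketch using the samples and batch_size attributes that flow_from_directory iterators expose):

steps_per_epoch = train_generator.samples // train_generator.batch_size             # 2276 // 16 = 142
validation_steps = validation_generator.samples // validation_generator.batch_size  # 251 // 16 = 15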

output3:
Please use Model.fit, which supports generators.

Epoch 1/100
71/71 [==============================] - 221s 3s/step - loss: 1.7205 - acc: 0.2553 - val_loss: 1.6083 - val_acc: 0.3750
Epoch 2/100
71/71 [==============================] - 121s 2s/step - loss: 1.5227 - acc: 0.3803 - val_loss: 1.4301 - val_acc: 0.4464
Epoch 3/100
71/71 [==============================] - 123s 2s/step - loss: 1.4023 - acc: 0.4428 - val_loss: 1.5464 - val_acc: 0.4286
Epoch 4/100
71/71 [==============================] - 118s 2s/step - loss: 1.3962 - acc: 0.4489 - val_loss: 1.4306 - val_acc: 0.4286
Epoch 5/100
71/71 [==============================] - 116s 2s/step - loss: 1.3639 - acc: 0.4384 - val_loss: 1.3748 - val_acc: 0.4196
Epoch 6/100
71/71 [==============================] - 118s 2s/step - loss: 1.2870 - acc: 0.4850 - val_loss: 1.2453 - val_acc: 0.5536
Epoch 7/100
71/71 [==============================] - 117s 2s/step - loss: 1.2601 - acc: 0.5133 - val_loss: 1.4683 - val_acc: 0.4464
Epoch 8/100
71/71 [==============================] - 117s 2s/step - loss: 1.2186 - acc: 0.5088 - val_loss: 1.2113 - val_acc: 0.5000
Epoch 9/100
71/71 [==============================] - 115s 2s/step - loss: 1.1850 - acc: 0.5214 - val_loss: 1.3347 - val_acc: 0.4464
Epoch 10/100
71/71 [==============================] - 122s 2s/step - loss: 1.1420 - acc: 0.5423 - val_loss: 1.2093 - val_acc: 0.5536
Epoch 11/100
71/71 [==============================] - 112s 2s/step - loss: 1.0990 - acc: 0.5678 - val_loss: 1.1321 - val_acc: 0.5089
Epoch 12/100
71/71 [==============================] - 112s 2s/step - loss: 1.0840 - acc: 0.5721 - val_loss: 1.1863 - val_acc: 0.5357
Epoch 13/100
71/71 [==============================] - 155s 2s/step - loss: 1.0766 - acc: 0.5979 - val_loss: 1.4430 - val_acc: 0.4554
Epoch 14/100
71/71 [==============================] - 169s 2s/step - loss: 0.9695 - acc: 0.6338 - val_loss: 0.9983 - val_acc: 0.6518
Epoch 15/100
71/71 [==============================] - 119s 2s/step - loss: 0.9901 - acc: 0.6294 - val_loss: 1.1473 - val_acc: 0.5625
Epoch 16/100
71/71 [==============================] - 115s 2s/step - loss: 1.0062 - acc: 0.6406 - val_loss: 1.0303 - val_acc: 0.6250
Epoch 17/100
71/71 [==============================] - 114s 2s/step - loss: 0.9439 - acc: 0.6397 - val_loss: 1.0116 - val_acc: 0.5714
Epoch 18/100
71/71 [==============================] - 116s 2s/step - loss: 0.9797 - acc: 0.6391 - val_loss: 1.1799 - val_acc: 0.5268
Epoch 19/100
71/71 [==============================] - 115s 2s/step - loss: 0.9340 - acc: 0.6459 - val_loss: 1.0967 - val_acc: 0.5804
Epoch 20/100
71/71 [==============================] - 114s 2s/step - loss: 0.8780 - acc: 0.6708 - val_loss: 1.0752 - val_acc: 0.5804
Epoch 21/100
71/71 [==============================] - 114s 2s/step - loss: 0.8546 - acc: 0.6824 - val_loss: 1.1991 - val_acc: 0.5714
Epoch 22/100
71/71 [==============================] - 115s 2s/step - loss: 0.8694 - acc: 0.6628 - val_loss: 1.2398 - val_acc: 0.5357
Epoch 23/100
71/71 [==============================] - 117s 2s/step - loss: 0.8411 - acc: 0.6901 - val_loss: 1.1025 - val_acc: 0.6786
Epoch 24/100
71/71 [==============================] - 117s 2s/step - loss: 0.8107 - acc: 0.7130 - val_loss: 1.1774 - val_acc: 0.5536
Epoch 25/100
71/71 [==============================] - 116s 2s/step - loss: 0.8752 - acc: 0.6673 - val_loss: 0.8081 - val_acc: 0.6696
Epoch 26/100
71/71 [==============================] - 111s 2s/step - loss: 0.8150 - acc: 0.7020 - val_loss: 0.9926 - val_acc: 0.6518
Epoch 27/100
71/71 [==============================] - 113s 2s/step - loss: 0.7882 - acc: 0.7104 - val_loss: 0.9890 - val_acc: 0.6339
Epoch 28/100
71/71 [==============================] - 113s 2s/step - loss: 0.7705 - acc: 0.7201 - val_loss: 1.0953 - val_acc: 0.6071
Epoch 29/100
71/71 [==============================] - 112s 2s/step - loss: 0.7430 - acc: 0.7377 - val_loss: 0.8792 - val_acc: 0.6429
Epoch 30/100
71/71 [==============================] - 112s 2s/step - loss: 0.7626 - acc: 0.7210 - val_loss: 0.8883 - val_acc: 0.6518
Epoch 31/100
71/71 [==============================] - 113s 2s/step - loss: 0.8552 - acc: 0.6815 - val_loss: 1.3025 - val_acc: 0.4732
Epoch 32/100
71/71 [==============================] - 112s 2s/step - loss: 0.7941 - acc: 0.7069 - val_loss: 0.8129 - val_acc: 0.7054
Epoch 33/100
71/71 [==============================] - 109s 2s/step - loss: 0.7429 - acc: 0.7313 - val_loss: 0.8716 - val_acc: 0.6696
Epoch 34/100
71/71 [==============================] - 112s 2s/step - loss: 0.6959 - acc: 0.7536 - val_loss: 1.0984 - val_acc: 0.6250
Epoch 35/100
71/71 [==============================] - 111s 2s/step - loss: 0.7375 - acc: 0.7394 - val_loss: 0.8002 - val_acc: 0.6786
Epoch 36/100
71/71 [==============================] - 111s 2s/step - loss: 0.7072 - acc: 0.7333 - val_loss: 0.7551 - val_acc: 0.7321
Epoch 37/100
71/71 [==============================] - 113s 2s/step - loss: 0.7440 - acc: 0.7403 - val_loss: 1.1043 - val_acc: 0.5982
Epoch 38/100
71/71 [==============================] - 111s 2s/step - loss: 0.7527 - acc: 0.7174 - val_loss: 0.8664 - val_acc: 0.6964
Epoch 39/100
71/71 [==============================] - 111s 2s/step - loss: 0.6643 - acc: 0.7473 - val_loss: 0.8213 - val_acc: 0.7232
Epoch 40/100
71/71 [==============================] - 112s 2s/step - loss: 0.7021 - acc: 0.7456 - val_loss: 0.8613 - val_acc: 0.7143
Epoch 41/100
71/71 [==============================] - 113s 2s/step - loss: 0.6386 - acc: 0.7720 - val_loss: 1.0223 - val_acc: 0.6250
Epoch 42/100
71/71 [==============================] - 111s 2s/step - loss: 0.6568 - acc: 0.7447 - val_loss: 0.9515 - val_acc: 0.6786
Epoch 43/100
71/71 [==============================] - 112s 2s/step - loss: 0.6394 - acc: 0.7852 - val_loss: 0.7870 - val_acc: 0.6786
Epoch 44/100
71/71 [==============================] - 113s 2s/step - loss: 0.6437 - acc: 0.7676 - val_loss: 0.8471 - val_acc: 0.6696
Epoch 45/100
71/71 [==============================] - 111s 2s/step - loss: 0.6337 - acc: 0.7623 - val_loss: 0.8950 - val_acc: 0.6964
Epoch 46/100
71/71 [==============================] - 111s 2s/step - loss: 0.5993 - acc: 0.7820 - val_loss: 1.0203 - val_acc: 0.5625
Epoch 47/100
71/71 [==============================] - 117s 2s/step - loss: 0.6087 - acc: 0.7799 - val_loss: 1.0065 - val_acc: 0.5982
Epoch 48/100
71/71 [==============================] - 138s 2s/step - loss: 0.6285 - acc: 0.7722 - val_loss: 0.8781 - val_acc: 0.6875
Epoch 49/100
71/71 [==============================] - 121s 2s/step - loss: 0.5612 - acc: 0.7984 - val_loss: 1.2335 - val_acc: 0.6429
Epoch 50/100
71/71 [==============================] - 116s 2s/step - loss: 0.5699 - acc: 0.7984 - val_loss: 0.9054 - val_acc: 0.6607
Epoch 51/100
71/71 [==============================] - 110s 2s/step - loss: 0.6030 - acc: 0.7909 - val_loss: 0.9821 - val_acc: 0.6429
Epoch 52/100
71/71 [==============================] - 111s 2s/step - loss: 0.6945 - acc: 0.7412 - val_loss: 0.8685 - val_acc: 0.6786
Epoch 53/100
71/71 [==============================] - 113s 2s/step - loss: 0.5679 - acc: 0.7905 - val_loss: 0.8510 - val_acc: 0.6875
Epoch 54/100
71/71 [==============================] - 113s 2s/step - loss: 0.6023 - acc: 0.7835 - val_loss: 0.8247 - val_acc: 0.6518
Epoch 55/100
71/71 [==============================] - 112s 2s/step - loss: 0.5590 - acc: 0.7896 - val_loss: 0.7802 - val_acc: 0.7321
Epoch 56/100
71/71 [==============================] - 114s 2s/step - loss: 0.5679 - acc: 0.8028 - val_loss: 0.7660 - val_acc: 0.6964
Epoch 57/100
71/71 [==============================] - 110s 2s/step - loss: 0.5839 - acc: 0.8028 - val_loss: 0.7611 - val_acc: 0.7321
Epoch 58/100
71/71 [==============================] - 111s 2s/step - loss: 0.5590 - acc: 0.7967 - val_loss: 1.0786 - val_acc: 0.6071
Epoch 59/100
71/71 [==============================] - 116s 2s/step - loss: 0.5194 - acc: 0.8275 - val_loss: 0.7342 - val_acc: 0.7321
Epoch 60/100
71/71 [==============================] - 110s 2s/step - loss: 0.4677 - acc: 0.8185 - val_loss: 0.9167 - val_acc: 0.6786
Epoch 61/100
71/71 [==============================] - 109s 2s/step - loss: 0.4906 - acc: 0.8052 - val_loss: 0.7638 - val_acc: 0.7321
Epoch 62/100
71/71 [==============================] - 112s 2s/step - loss: 0.5267 - acc: 0.8081 - val_loss: 1.2296 - val_acc: 0.5982
Epoch 63/100
71/71 [==============================] - 111s 2s/step - loss: 0.5880 - acc: 0.7909 - val_loss: 0.8299 - val_acc: 0.6964
Epoch 64/100
71/71 [==============================] - 110s 2s/step - loss: 0.5203 - acc: 0.8203 - val_loss: 0.6984 - val_acc: 0.7589
Epoch 65/100
71/71 [==============================] - 110s 2s/step - loss: 0.5617 - acc: 0.8007 - val_loss: 0.8506 - val_acc: 0.6964
Epoch 66/100
71/71 [==============================] - 112s 2s/step - loss: 0.4157 - acc: 0.8530 - val_loss: 0.9649 - val_acc: 0.6786
Epoch 67/100
71/71 [==============================] - 110s 2s/step - loss: 0.4726 - acc: 0.8363 - val_loss: 0.7467 - val_acc: 0.7411
Epoch 68/100
71/71 [==============================] - 115s 2s/step - loss: 0.4825 - acc: 0.8247 - val_loss: 0.9306 - val_acc: 0.6964
Epoch 69/100
71/71 [==============================] - 110s 2s/step - loss: 0.4757 - acc: 0.8363 - val_loss: 1.1517 - val_acc: 0.6161
Epoch 70/100
71/71 [==============================] - 111s 2s/step - loss: 0.4948 - acc: 0.8185 - val_loss: 0.8304 - val_acc: 0.7054
Epoch 71/100
71/71 [==============================] - 110s 2s/step - loss: 0.5174 - acc: 0.8096 - val_loss: 0.8679 - val_acc: 0.6518
Epoch 72/100
71/71 [==============================] - 111s 2s/step - loss: 0.4799 - acc: 0.8274 - val_loss: 0.8524 - val_acc: 0.7321
Epoch 73/100
71/71 [==============================] - 112s 2s/step - loss: 0.4212 - acc: 0.8495 - val_loss: 1.0715 - val_acc: 0.6875
Epoch 74/100
71/71 [==============================] - 111s 2s/step - loss: 0.5003 - acc: 0.8078 - val_loss: 0.8279 - val_acc: 0.7143
Epoch 75/100
71/71 [==============================] - 111s 2s/step - loss: 0.4267 - acc: 0.8425 - val_loss: 0.7447 - val_acc: 0.7500
Epoch 76/100
71/71 [==============================] - 111s 2s/step - loss: 0.4268 - acc: 0.8371 - val_loss: 0.8244 - val_acc: 0.7500
Epoch 77/100
71/71 [==============================] - 111s 2s/step - loss: 0.4720 - acc: 0.8247 - val_loss: 0.8961 - val_acc: 0.6786
Epoch 78/100
71/71 [==============================] - 112s 2s/step - loss: 0.4979 - acc: 0.8204 - val_loss: 0.8691 - val_acc: 0.6429
Epoch 79/100
71/71 [==============================] - 111s 2s/step - loss: 0.4445 - acc: 0.8461 - val_loss: 1.0964 - val_acc: 0.5982
Epoch 80/100
71/71 [==============================] - 112s 2s/step - loss: 0.4660 - acc: 0.8283 - val_loss: 0.9248 - val_acc: 0.6607
Epoch 81/100
71/71 [==============================] - 111s 2s/step - loss: 0.4824 - acc: 0.8222 - val_loss: 1.2059 - val_acc: 0.6339
Epoch 82/100
71/71 [==============================] - 110s 2s/step - loss: 0.4382 - acc: 0.8354 - val_loss: 0.8243 - val_acc: 0.6875
Epoch 83/100
71/71 [==============================] - 111s 2s/step - loss: 0.3791 - acc: 0.8603 - val_loss: 1.3547 - val_acc: 0.5804
Epoch 84/100
71/71 [==============================] - 112s 2s/step - loss: 0.4175 - acc: 0.8468 - val_loss: 1.1149 - val_acc: 0.7321
Epoch 85/100
71/71 [==============================] - 113s 2s/step - loss: 0.6471 - acc: 0.7740 - val_loss: 1.0958 - val_acc: 0.6250
Epoch 86/100
71/71 [==============================] - 113s 2s/step - loss: 0.4434 - acc: 0.8504 - val_loss: 0.8250 - val_acc: 0.6696
Epoch 87/100
71/71 [==============================] - 112s 2s/step - loss: 0.3719 - acc: 0.8559 - val_loss: 0.8524 - val_acc: 0.7589
Epoch 88/100
71/71 [==============================] - 109s 2s/step - loss: 0.3978 - acc: 0.8532 - val_loss: 0.8410 - val_acc: 0.7321
Epoch 89/100
71/71 [==============================] - 111s 2s/step - loss: 0.4387 - acc: 0.8398 - val_loss: 0.8426 - val_acc: 0.7232
Epoch 90/100
71/71 [==============================] - 110s 2s/step - loss: 0.4056 - acc: 0.8594 - val_loss: 0.8563 - val_acc: 0.7232
Epoch 91/100
71/71 [==============================] - 111s 2s/step - loss: 0.3897 - acc: 0.8592 - val_loss: 0.7448 - val_acc: 0.7321
Epoch 92/100
71/71 [==============================] - 110s 2s/step - loss: 0.3947 - acc: 0.8541 - val_loss: 0.7799 - val_acc: 0.7321
Epoch 93/100
71/71 [==============================] - 109s 2s/step - loss: 0.4416 - acc: 0.8488 - val_loss: 0.9649 - val_acc: 0.6518
Epoch 94/100
71/71 [==============================] - 116s 2s/step - loss: 0.3962 - acc: 0.8550 - val_loss: 1.2210 - val_acc: 0.6607
Epoch 95/100
71/71 [==============================] - 124s 2s/step - loss: 0.4087 - acc: 0.8577 - val_loss: 1.0710 - val_acc: 0.6607
Epoch 96/100
71/71 [==============================] - 117s 2s/step - loss: 0.3748 - acc: 0.8671 - val_loss: 0.8149 - val_acc: 0.7589
Epoch 97/100
71/71 [==============================] - 114s 2s/step - loss: 0.3882 - acc: 0.8550 - val_loss: 1.1649 - val_acc: 0.6875
Epoch 98/100
71/71 [==============================] - 115s 2s/step - loss: 0.3485 - acc: 0.8719 - val_loss: 0.9793 - val_acc: 0.6786
Epoch 99/100
71/71 [==============================] - 118s 2s/step - loss: 0.4128 - acc: 0.8477 - val_loss: 1.0489 - val_acc: 0.6964
Epoch 100/100
71/71 [==============================] - 115s 2s/step - loss: 0.3668 - acc: 0.8644 - val_loss: 1.1848 - val_acc: 0.6250



Results: below we take a batch of 16 images from the validation set, display them with their true labels, and show the model's predictions alongside. The accuracy is reasonably good: most of the images are assigned the correct class.

test_x, test_y = validation_generator.__getitem__(1)  # one batch of 16 validation images

preds = model.predict(test_x)

plt.figure(figsize=(16, 16))
for i in range(16):
    plt.subplot(4, 4, i+1)
    plt.title('pred:%s / truth:%s' % (labels[np.argmax(preds[i])], labels[np.argmax(test_y[i])]))
    plt.imshow(test_x[i])
plt.show()

output4:

[figure: 4×4 grid of validation images with predicted and true labels]
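
Beyond batches from the validation generator, the trained model can also classify a single image file, provided we reproduce the same preprocessing (resize to 300 × 300, rescale to [0, 1]) by hand. A minimal sketch, where 'test.jpg' is a hypothetical path:

img = load_img('test.jpg', target_size=(300, 300))  # 'test.jpg' is a placeholder path
x = img_to_array(img) / 255.0                       # same rescaling as the generators
x = np.expand_dims(x, axis=0)                       # add the batch dimension: (1, 300, 300, 3)
pred = model.predict(x)
print(labels[np.argmax(pred[0])])                   # predicted class name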

Full example code: https://github.com/wennaz/Deep_Learning
