
# Scenario: fetching ~640,000 documents from a MongoDB collection in a single query raised
# pymongo.errors.DocumentTooLarge: BSON document too large (28888095 bytes) -
# the connected server supports BSON document sizes up to 16777216 bytes.
# The query with the full id list is too large to send in one request, so split the ids
# into 10 batches of roughly 64,000 each and merge every batch's results into a local dict.
# Toy example of the batching pattern: slice a list into roughly 10 chunks.
import math

data = [1, 2, 3, 4, 5, 6, 7, 8, 9, 3, 5, 6, 7, 8, 9, 6, 5, 4, 3, 8, 9]
batch_step = math.ceil(len(data) / 10)  # ceil keeps the step >= 1 even for short lists
for index in range(0, len(data), batch_step):
    item_list = data[index:index + batch_step]  # one chunk per iteration
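# With the sample list above (21 items), math.ceil(21 / 10) gives batch_step = 3, so the
# loop produces 7 slices of 3 items each: [1, 2, 3], [4, 5, 6], [7, 8, 9], and so on.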
# Real example: look up image metadata for a large id list in batches.
from pymongo import MongoClient

mdb = MongoClient('120.:20002', username='xt', password='xxxxxx')

image_ids = ["001", "002", "003", ...]  # full list of image ids to look up
image_dict = {}
batch_step = math.ceil(len(image_ids) / 10)
for idx in range(0, len(image_ids), batch_step):
    image_ids_part = image_ids[idx:idx + batch_step]
    # One query per batch keeps the $in list well under the 16 MB BSON limit.
    image_infos = mdb['database_name']['image_collection'].find({"image_id": {"$in": image_ids_part}})
    image_one = {}
    for image_info in image_infos:
        # Keep only documents that carry an image_size field, keyed by image_id.
        if "image_size" in image_info:
            image_one[image_info["image_id"]] = image_info
    image_dict.update(image_one)
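# A minimal sketch of the same idea (not from the original post; the function name,
# batch_size value, and usage line are illustrative assumptions): cap each $in list at a
# fixed batch size instead of dividing the id list into 10 parts, so the query document
# stays small even as the id list grows.
def fetch_in_batches(collection, ids, batch_size=10000):
    """Return a dict mapping image_id to document, querying at most batch_size ids per request."""
    result = {}
    for start in range(0, len(ids), batch_size):
        chunk = ids[start:start + batch_size]
        for doc in collection.find({"image_id": {"$in": chunk}}):
            result[doc["image_id"]] = doc
    return result

# Hypothetical usage, reusing the mdb client defined above:
# image_dict = fetch_in_batches(mdb['database_name']['image_collection'], image_ids)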