Sending JSON from Python to Node via child_process gets truncated if too long, how to fix?
My Node and Python backends are running just fine, but I'm now facing an issue where, if the JSON I return from Python to Node gets too long, it is split into two chunks and my JSON.parse on the Node side fails.
How should I solve this? For example, the first batch gets cut off at
... [1137.6962355826706, -100.78015825640887], [773.3834338399517, -198
and the second one contains the rest:
.201506231888], [-87276.575065248, -60597.8827676457], [793.1850250453127,
-192.1674702207991], [1139.4465453979683, -100.56741252031816],
[780.498416769341, -196.04064849430705]]}
Do I have to create some logic on the Node side to handle long JSONs, or is this some kind of buffering issue on the Python side that I can fix with the proper settings? This is everything I'm doing on the Python side:
import sys
import json
import cv2
import numpy as np

outPoints, _ = cv2.projectPoints(inPoints, np.asarray(rvec),
                                 np.asarray(tvec), np.asarray(camera_matrix),
                                 np.asarray(dist_coeffs))
# flatten the output to get rid of double brackets per result before JSONifying
flattened = [val for sublist in outPoints for val in sublist]
print(json.dumps({'testdata': np.asarray(flattened).tolist()}))
sys.stdout.flush()
And on the Node side:
// Handle python data from print() function
pythonProcess.stdout.on('data', function (data) {
    try {
        // If JSON, handle the data
        console.log(JSON.parse(data.toString()));
    } catch (e) {
        // Otherwise treat as a log entry
        console.log(data.toString());
    }
});
The data you receive is chunked, so if you want to parse complete JSON you will need to join all the chunks and run JSON.parse on end, as in the code further below.
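This is not a Python-side buffering problem; it is how Node delivers piped output. The Node.js child_process documentation explains: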
By default, pipes for stdin, stdout, and stderr are established
between the parent Node.js process and the spawned child. These pipes
have limited (and platform-specific) capacity. If the child process
writes to stdout in excess of that limit without the output being
captured, the child process will block waiting for the pipe buffer to
accept more data.
On Linux, each chunk is limited to 65536 bytes. From man 7 pipe:
In Linux versions before 2.6.11, the capacity of a pipe was the same
as the system page size (e.g., 4096 bytes on i386). Since Linux
2.6.11, the pipe capacity is 65536 bytes.
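Accumulate the chunks in the data handler and parse only once the stream ends: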
let result = '';

pythonProcess.stdout.on('data', data => {
    result += data.toString();
    // Or Buffer.concat if you prefer.
});

pythonProcess.stdout.on('end', () => {
    try {
        // If JSON, handle the data
        console.log(JSON.parse(result));
    } catch (e) {
        // Otherwise treat as a log entry
        console.log(result);
    }
});
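Note that end fires only when the child closes its stdout, so this pattern suits a single payload per process run. If your Python script stays alive and prints several JSON messages over time, one common alternative (a sketch, not part of the original answer) is newline-delimited JSON: print(json.dumps(...)) already emits exactly one line per message, so Node can parse line by line with the built-in readline module:

const readline = require('readline');

// Each complete line from the child's stdout is one JSON message.
const rl = readline.createInterface({ input: pythonProcess.stdout });

rl.on('line', line => {
    try {
        // If the line is JSON, handle the data
        console.log(JSON.parse(line));
    } catch (e) {
        // Otherwise treat as a log entry
        console.log(line);
    }
});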