Video Fusion

One of my projects uses video fusion, so I am recording the implementation process here.

Simply put, video fusion means bringing a video into the map, where it behaves like any other feature in the scene.

The result looks like this:

[Screenshot: a video fused into the map scene]

Background

The ArcGIS JS API provides externalRenderers, which lets you render third-party WebGL content inside the map (raw WebGL, or a WebGL library such as three.js).

three.js makes it easy to load a video into WebGL.
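
A minimal sketch of what that hook looks like (assuming a SceneView already stored in a variable named view; setup runs once, render runs every frame the view draws):

require(["esri/views/3d/externalRenderers"], function (externalRenderers) {
  const myRenderer = {
    setup: function (context) {
      // context.gl is the WebGL context shared with the ArcGIS JS API;
      // create the three.js renderer, scene, camera and meshes here.
    },
    render: function (context) {
      // draw the three.js scene here, then hand the WebGL state back to the API.
      context.resetWebGLState();
    }
  };
  externalRenderers.add(view, myRenderer);
});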

Approach

Two steps: first, load the video in three.js; second, integrate the finished three.js program into the map.

Implementation

This was my first time working with three.js, so I followed the official site and wrote an introductory example: a rotating cube. The code is in Appendix 1.

[Screenshot: the rotating cube example]

A few three.js concepts need to be understood (a short sketch tying them together follows the list):

  • Scene: the scene everything lives in
  • Camera: the camera (viewpoint)
  • Geometry: the shape (the "skeleton")
  • Material: the material (the "clothing")
  • Mesh: a 3D model (skeleton + clothing)
  • Light: a light source
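
A minimal sketch of how these pieces fit together (essentially a condensed version of Appendix 1; the sizes and color are arbitrary):

const scene = new THREE.Scene();                                    // Scene: container for everything
const camera = new THREE.PerspectiveCamera(75, 1, 0.1, 1000);       // Camera: the viewpoint
const geometry = new THREE.BoxGeometry();                           // Geometry: the skeleton
const material = new THREE.MeshBasicMaterial({ color: 0x00ff00 });  // Material: the clothing
const mesh = new THREE.Mesh(geometry, material);                    // Mesh: skeleton + clothing
scene.add(mesh);
scene.add(new THREE.AmbientLight(0xffffff));                        // Light: needed once lit materials are used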

three.js provides a class, VideoTexture, which maps a video onto a 3D model as a texture. The code is in Appendix 2.

[Screenshot: the cube with a video texture on its faces]
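
The core of Appendix 2 is only a few lines; a minimal sketch of the idea (./video.mp4 is the placeholder path used in the appendix):

const video = document.createElement("video");
video.src = "./video.mp4";
video.loop = true;
video.muted = true;   // a muted video can autoplay without user interaction
video.play();

const texture = new THREE.VideoTexture(video);   // refreshes from the video every frame
const material = new THREE.MeshBasicMaterial({ map: texture });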

The three.js program is then integrated into the map through externalRenderers. The code is in Appendix 3, and the parameters are explained in the code comments.

Whenever the map view redraws, the external renderer's render function is called. Any animation of a three.js model has to be driven from inside this function, so a simple three.js animation example is included as Appendix 4.

[Screenshot: the animated box in the map scene]
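
Distilled from Appendix 4, the pattern inside the external renderer's render method looks roughly like this (this.meshObj is the mesh cached in setup):

render: function (context) {
  // ...update the three.js camera and lights from context as usual...

  // drive the animation here: mutate the cached model each frame
  this.meshObj.rotation.z += 0.01;
  this.meshObj.rotation.x += 0.01;

  this.renderer.state.reset();
  this.renderer.render(this.scene, this.camera);

  // ask the view to draw another frame so the animation keeps running
  externalRenderers.requestRender(view);
  context.resetWebGLState();
}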

Appendices

Note: three.js 123dev, ArcGIS JS API 4.16.

Appendix 1: A rotating cube

<!DOCTYPE html>
<html>

<head>
  <title>My first three.js app</title>
  <style>
    body {
      margin: 0;
    }

    canvas {
      display: block;
    }
  </style>
</head>

<body>
  <script src="./three.js"></script>
  <script>
    // Create the scene and camera
    const scene = new THREE.Scene();
    const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);

    // Create the renderer and add it to the page
    const renderer = new THREE.WebGLRenderer();
    renderer.setSize(window.innerWidth, window.innerHeight);
    document.body.appendChild(renderer.domElement);

    // Create the geometry and material
    const geometry = new THREE.BoxGeometry();
    const material = new THREE.MeshBasicMaterial({
      color: 0x00ff00
    });
    // Create the 3D model
    const cube = new THREE.Mesh(geometry, material);
    scene.add(cube);

    camera.position.z = 5;

    const animate = function () {
      requestAnimationFrame(animate);

      cube.rotation.x += 0.01;
      cube.rotation.y += 0.01;

      renderer.render(scene, camera);
    };

    animate();
  </script>
</body>

</html>

Appendix 2: Video as a texture

<!DOCTYPE html>
<html>

<head>
  <title>video in THREE</title>
  <style>
    body {
      margin: 0;
    }

    canvas {
      display: block;
    }

    video {
      position: absolute;
      left: 10px;
      bottom: 10px;
      width: 300px;
      height: 300px;
    }
  </style>
</head>

<body>
  <script src="./three.js"></script>
  <script>
    // Create the scene and camera
    const scene = new THREE.Scene();
    const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);


    // Create the renderer and add it to the page
    const renderer = new THREE.WebGLRenderer();
    renderer.setSize(window.innerWidth, window.innerHeight);
    document.body.appendChild(renderer.domElement);

    // Create the video element
    const video = document.createElement("video");
    video.crossOrigin = "anonymous";
    video.loop = true;
    video.muted = true;
    video.src = './video.mp4';
    video.play();

    // Create the video texture
    const texture = new THREE.VideoTexture(video);
    texture.minFilter = THREE.LinearFilter;
    texture.magFilter = THREE.LinearFilter;
    texture.format = THREE.RGBFormat;
    texture.generateMipmaps = false;

    // Create the geometry and materials
    const geometry = new THREE.BoxGeometry(); // box with six faces
    const materialArray = new Array(6).fill(new THREE.MeshBasicMaterial({
      map: texture
    }));

    // Create the 3D model
    const cube = new THREE.Mesh(geometry, materialArray);
    scene.add(cube);

    camera.position.z = 5;

    const animate = function () {
      if (video.readyState === video.HAVE_ENOUGH_DATA) {
        if (texture) {
          texture.needsUpdate = true;
        }
      }
      requestAnimationFrame(animate);
      renderer.render(scene, camera);
    };

    animate();
  </script>
</body>

</html>

Appendix 3: Video fusion

<html>

<head>
  <meta charset="utf-8">
  <meta name="viewport" content="initial-scale=1, maximum-scale=1, user-scalable=no">
  <title>Use three.js from an external renderer</title>
  <style>
    html,
    body,
    #viewDiv {
      padding: 0;
      margin: 0;
      height: 100%;
      width: 100%;
    }
  </style>

  <link rel="stylesheet" type="text/css" href="../jsapi/4.16/esri/css/main.css">
  <!-- three.js must be loaded before the ArcGIS JS API, otherwise a multipleDefine error is thrown -->
  <script type="text/javascript" src="./three.js"></script>
  <script type="text/javascript" src="../jsapi/4.16/dojo/dojo.js"></script>

  <script>
    require([
      "esri/Map",
      "esri/views/SceneView",
      "esri/views/3d/externalRenderers",
      "esri/geometry/SpatialReference",
    ], function (
      EsriMap,
      SceneView,
      externalRenderers,
      SpatialReference,
    ) {
      // Create a map
      //////////////////////////////////////////////////////////////////////////////////////
      var map = new EsriMap({
        basemap: "gray",
        ground: "world-elevation"
      });

      // Create a SceneView
      //////////////////////////////////////////////////////////////////////////////////////
      var view = new SceneView({
        container: "viewDiv",
        map: map,
        viewingMode: "global",
        camera: {
          position: {
            x: 12979966.298703134,
            y: 4884427.262489007,
            z: 2548.834649768658,
            spatialReference: {
              wkid: 102100
            }
          },
          heading: 352.94663895458274,
          tilt: 48.19183381946247
        }
      });

      const geoData = [116.5968704223633, 40.15447407601198, 100.2];

      // Disable lighting based on the current camera position.
      // We want to display the lighting according to the current time of day.
      // view.environment.lighting.cameraTrackingEnabled = false;

      // Create our custom external renderer
      //////////////////////////////////////////////////////////////////////////////////////

      const externalRenderer = {
        renderer: null, // three.js renderer
        camera: null, // three.js camera
        scene: null, // three.js scene
        ambient: null, // three.js ambient light source
        sun: null, // three.js sun light source

        /**
         * Setup function, called once by the ArcGIS JS API.
         */
        setup: function (context) {
          // initialize the three.js renderer
          //////////////////////////////////////////////////////////////////////////////////////
          this.renderer = new THREE.WebGLRenderer({
            context: context.gl,
            premultipliedAlpha: false,
          });
          this.renderer.setPixelRatio(window.devicePixelRatio);
          this.renderer.setViewport(0, 0, view.width, view.height);

          // prevent three.js from clearing the buffers provided by the ArcGIS JS API.
          this.renderer.autoClear = false;

          // The ArcGIS JS API renders to custom offscreen buffers, and not to the default framebuffers.
          // We have to inject this bit of code into the three.js runtime in order for it to bind those
          // buffers instead of the default ones.
          var originalSetRenderTarget = this.renderer.setRenderTarget.bind(
            this.renderer
          );
          this.renderer.setRenderTarget = function (target) {
            originalSetRenderTarget(target);
            if (target == null) {
              context.bindRenderTarget();
            }
          };

          // setup the three.js scene
          ///////////////////////////////////////////////////////////////////////////////////////

          this.scene = new THREE.Scene();
          // Add an axes helper; for reasons I have not tracked down, the scene does not
          // render correctly when it contains only a single model without one
          this.scene.add(new THREE.AxesHelper(9000000));
          // setup the camera
          this.camera = new THREE.PerspectiveCamera();

          // setup scene lighting
          this.ambient = new THREE.AmbientLight(0xffffff, 0.5);
          this.scene.add(this.ambient);
          this.sun = new THREE.DirectionalLight(0xffffff, 0.5);
          this.scene.add(this.sun);

          // Create the plane the video will be projected onto
          var geometry = new THREE.PlaneGeometry(500, 500); // rectangular plane

          // Create the video element
          const video = document.createElement("video");
          video.crossOrigin = "anonymous";
          video.loop = true;
          video.muted = true;
          video.src = './video.mp4';
          video.play();

          var texture = new THREE.VideoTexture(video);
          texture.minFilter = THREE.LinearFilter;
          texture.magFilter = THREE.LinearFilter;
          texture.format = THREE.RGBFormat;
          var material = new THREE.MeshPhongMaterial({
            map: texture, // use the video as the texture map
            side: THREE.DoubleSide
          });
          // Create the mesh
          var meshObj = new THREE.Mesh(geometry, material);
          // Coordinate transformation: geographic position -> render coordinates
          var transform = new THREE.Matrix4();
          transform.fromArray(
            externalRenderers.renderCoordinateTransformAt(
              view,
              geoData,
              SpatialReference.WGS84,
              new Array(16)
            )
          );
          // A reference to an array where the 16 matrix elements will be stored.
          // The resulting matrix follows OpenGL conventions where the translation components occupy
          // the 13th, 14th and 15th elements. If undefined, a newly created matrix is returned.
          meshObj.position.x = transform.elements[12];
          meshObj.position.y = transform.elements[13];
          meshObj.position.z = transform.elements[14];
          // Rotate the mesh
          meshObj.rotation.z = -Math.asin(
            Math.cos((geoData[1] / 180) * Math.PI) *
            Math.cos((geoData[0] / 180) * Math.PI)
          );
          meshObj.rotation.x = Math.atan(
            Math.tan((geoData[1] / 180) * Math.PI) /
            Math.sin((geoData[0] / 180) * Math.PI)
          );
          this.scene.add(meshObj);

          // cleanup after ourselves
          context.resetWebGLState();
        },

        render: function (context) {
          // update camera parameters
          ///////////////////////////////////////////////////////////////////////////////////
          var cam = context.camera;

          this.camera.position.set(cam.eye[0], cam.eye[1], cam.eye[2]);
          this.camera.up.set(cam.up[0], cam.up[1], cam.up[2]);
          this.camera.lookAt(
            new THREE.Vector3(cam.center[0], cam.center[1], cam.center[2])
          );

          // Projection matrix can be copied directly
          this.camera.projectionMatrix.fromArray(cam.projectionMatrix);

          // update lighting
          /////////////////////////////////////////////////////////////////////////////////////////////////////
          view.environment.lighting.date = Date.now();

          var l = context.sunLight;
          this.sun.position.set(l.direction[0], l.direction[1], l.direction[2]);
          this.sun.intensity = l.diffuse.intensity;
          this.sun.color = new THREE.Color(
            l.diffuse.color[0],
            l.diffuse.color[1],
            l.diffuse.color[2]
          );

          this.ambient.intensity = l.ambient.intensity;
          this.ambient.color = new THREE.Color(
            l.ambient.color[0],
            l.ambient.color[1],
            l.ambient.color[2]
          );

          // draw the scene
          /////////////////////////////////////////////////////////////////////////////////////////////////////
          this.renderer.state.reset(); // this.renderer.resetGLState();
          this.renderer.render(this.scene, this.camera);

          // immediately request a re-render so the video texture keeps updating
          externalRenderers.requestRender(view);

          // cleanup
          context.resetWebGLState();
        },
      };

      // register the external renderer
      externalRenderers.add(view, externalRenderer);
    });
  </script>
</head>

<body>
  <div id="viewDiv"></div>
</body>

</html>

Appendix 4: A simple animation

<html>

<head>
  <meta charset="utf-8">
  <meta name="viewport" content="initial-scale=1, maximum-scale=1, user-scalable=no">
  <title>animate</title>
  <style>
    html,
    body,
    #viewDiv {
      padding: 0;
      margin: 0;
      height: 100%;
      width: 100%;
    }
  </style>

  <link rel="stylesheet" type="text/css" href="../jsapi/4.16/esri/css/main.css">
  <!-- three.js must be loaded before the ArcGIS JS API, otherwise a multipleDefine error is thrown -->
  <script type="text/javascript" src="./three.js"></script>
  <script type="text/javascript" src="../jsapi/4.16/dojo/dojo.js"></script>

  <script>
    require([
      "esri/Map",
      "esri/views/SceneView",
      "esri/views/3d/externalRenderers",
      "esri/geometry/SpatialReference",
    ], function (
      EsriMap,
      SceneView,
      externalRenderers,
      SpatialReference,
    ) {
      // Create a map
      //////////////////////////////////////////////////////////////////////////////////////
      var map = new EsriMap({
        basemap: "gray",
        ground: "world-elevation"
      });

      // Create a SceneView
      //////////////////////////////////////////////////////////////////////////////////////
      var view = new SceneView({
        container: "viewDiv",
        map: map,
        viewingMode: "global",
        camera: {
          position: {
            x: 12979966.298703134,
            y: 4884427.262489007,
            z: 2548.834649768658,
            spatialReference: {
              wkid: 102100
            }
          },
          heading: 352.94663895458274,
          tilt: 48.19183381946247
        }
      });
      const geoData = [116.5968704223633, 40.15447407601198, 100.2];


      const externalRenderer = {
        renderer: null, // three.js renderer
        camera: null, // three.js camera
        scene: null, // three.js scene
        ambient: null, // three.js ambient light source
        sun: null, // three.js sun light source

        meshObj: null, // cache the 3D model; the render function animates it by changing its rotation

        /**
         * Setup function, called once by the ArcGIS JS API.
         */
        setup: function (context) {
          // initialize the three.js renderer
          //////////////////////////////////////////////////////////////////////////////////////
          this.renderer = new THREE.WebGLRenderer({
            context: context.gl,
            premultipliedAlpha: false,
          });
          this.renderer.setPixelRatio(window.devicePixelRatio);
          this.renderer.setViewport(0, 0, view.width, view.height);

          // prevent three.js from clearing the buffers provided by the ArcGIS JS API.
          this.renderer.autoClear = false;

          // The ArcGIS JS API renders to custom offscreen buffers, and not to the default framebuffers.
          // We have to inject this bit of code into the three.js runtime in order for it to bind those
          // buffers instead of the default ones.
          var originalSetRenderTarget = this.renderer.setRenderTarget.bind(
            this.renderer
          );
          this.renderer.setRenderTarget = function (target) {
            originalSetRenderTarget(target);
            if (target == null) {
              context.bindRenderTarget();
            }
          };

          // setup the three.js scene
          ///////////////////////////////////////////////////////////////////////////////////////

          this.scene = new THREE.Scene();
          // Add an axes helper; for reasons I have not tracked down, the scene does not
          // render correctly when it contains only a single model without one
          this.scene.add(new THREE.AxesHelper(9000000));
          // setup the camera
          this.camera = new THREE.PerspectiveCamera();

          // setup scene lighting
          this.ambient = new THREE.AmbientLight(0xffffff, 0.5);
          this.scene.add(this.ambient);
          this.sun = new THREE.DirectionalLight(0xffffff, 0.5);
          this.scene.add(this.sun);

          // Create the box geometry
          var geometry = new THREE.BoxGeometry(500, 500, 500); // box
          const material = new THREE.MeshBasicMaterial({
            color: 0x00ff00
          });
          // Create the mesh
          var meshObj = new THREE.Mesh(geometry, material);
          // Coordinate transformation: geographic position -> render coordinates
          var transform = new THREE.Matrix4();
          transform.fromArray(
            externalRenderers.renderCoordinateTransformAt(
              view,
              geoData,
              SpatialReference.WGS84,
              new Array(16)
            )
          );
          // A reference to an array where the 16 matrix elements will be stored.
          // The resulting matrix follows OpenGL conventions where the translation components occupy
          // the 13th, 14th and 15th elements. If undefined, a newly created matrix is returned.
          meshObj.position.x = transform.elements[12];
          meshObj.position.y = transform.elements[13];
          meshObj.position.z = transform.elements[14];
          // Rotate the mesh
          meshObj.rotation.z = -Math.asin(
            Math.cos((geoData[1] / 180) * Math.PI) *
            Math.cos((geoData[0] / 180) * Math.PI)
          );
          meshObj.rotation.x = Math.atan(
            Math.tan((geoData[1] / 180) * Math.PI) /
            Math.sin((geoData[0] / 180) * Math.PI)
          );
          this.meshObj = meshObj;
          this.scene.add(meshObj);

          // cleanup after ourselves
          context.resetWebGLState();
        },

        render: function (context) {
          // update camera parameters
          ///////////////////////////////////////////////////////////////////////////////////
          var cam = context.camera;

          this.camera.position.set(cam.eye[0], cam.eye[1], cam.eye[2]);
          this.camera.up.set(cam.up[0], cam.up[1], cam.up[2]);
          this.camera.lookAt(
            new THREE.Vector3(cam.center[0], cam.center[1], cam.center[2])
          );

          // Projection matrix can be copied directly
          this.camera.projectionMatrix.fromArray(cam.projectionMatrix);

          // update lighting
          /////////////////////////////////////////////////////////////////////////////////////////////////////
          view.environment.lighting.date = Date.now();

          var l = context.sunLight;
          this.sun.position.set(l.direction[0], l.direction[1], l.direction[2]);
          this.sun.intensity = l.diffuse.intensity;
          this.sun.color = new THREE.Color(
            l.diffuse.color[0],
            l.diffuse.color[1],
            l.diffuse.color[2]
          );

          this.ambient.intensity = l.ambient.intensity;
          this.ambient.color = new THREE.Color(
            l.ambient.color[0],
            l.ambient.color[1],
            l.ambient.color[2]
          );

          // Animate the cached mesh (it was already added to the scene in setup)
          var meshObj = this.meshObj;
          meshObj.rotation.z = meshObj.rotation.z + 0.01;
          meshObj.rotation.x = meshObj.rotation.x + 0.01;

          // draw the scene
          /////////////////////////////////////////////////////////////////////////////////////////////////////
          this.renderer.state.reset(); // this.renderer.resetGLState();
          this.renderer.render(this.scene, this.camera);

          // immediately request a re-render so the animation keeps running
          externalRenderers.requestRender(view);

          // cleanup
          context.resetWebGLState();
        },
      };

      // register the external renderer
      externalRenderers.add(view, externalRenderer);
    });
  </script>
</head>

<body>
  <div id="viewDiv"></div>
</body>

</html>