Deep Neural Network-Based Scene Graph Generation for 3D Simulated Indoor Environments


KIPS Transactions on Software and Data Engineering, Vol. 8, No. 5, pp. 205-212, May. 2019
https://doi.org/10.3745/KTSDE.2019.8.5.205
Keywords: Scene Graph, 3D Indoor Environment, deep neural network, AI2-THOR
Abstract

A scene graph is a kind of knowledge graph that represents the objects found in an image together with the relationships between them. This paper proposes a 3D scene graph generation model for three-dimensional indoor environments. A 3D scene graph includes not only object types, positions, and attributes, but also the three-dimensional spatial relationships between objects. A 3D scene graph can therefore be viewed as a prior knowledge base describing the environment in which an agent will later be deployed, which makes it useful for many applications, such as visual question answering (VQA) and service robots. The proposed 3D scene graph generation model consists of four sub-networks: an object detection network (ObjNet), an attribute prediction network (AttNet), a transfer network (TransNet), and a relationship prediction network (RelNet). In several experiments with the 3D simulated indoor environments provided by AI2-THOR, we confirmed that the proposed model achieves high performance.
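To make the output format concrete, the sketch below shows what a 3D scene graph data structure and a pairwise spatial-relation step might look like. This is a minimal illustration, not the authors' implementation: all class names, fields, and the trivial y-axis relation rule standing in for RelNet are assumptions made here for clarity.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class SceneObject:
    label: str                              # object type (ObjNet's role)
    position: Tuple[float, float, float]    # (x, y, z) position in the room
    attributes: List[str] = field(default_factory=list)  # AttNet's role

@dataclass
class SceneGraph:
    objects: List[SceneObject]
    # (subject index, spatial predicate, object index) triples (RelNet's role)
    relations: List[Tuple[int, str, int]]

def predict_relation(a: SceneObject, b: SceneObject) -> str:
    # Stand-in for the learned RelNet: a trivial rule on the vertical (y) axis.
    return "above" if a.position[1] > b.position[1] else "below"

def generate_scene_graph(objects: List[SceneObject]) -> SceneGraph:
    # Stand-in for the full ObjNet -> AttNet -> TransNet -> RelNet pipeline:
    # here the detected, attributed objects are given, and only pairwise
    # spatial relations are derived.
    relations = [
        (i, predict_relation(a, b), j)
        for i, a in enumerate(objects)
        for j, b in enumerate(objects)
        if i != j
    ]
    return SceneGraph(objects=objects, relations=relations)

# Example: a mug resting above a table surface.
mug = SceneObject("Mug", (1.0, 0.9, 2.0), attributes=["ceramic"])
table = SceneObject("Table", (1.0, 0.4, 2.0), attributes=["wooden"])
graph = generate_scene_graph([mug, table])
print(graph.relations)  # [(0, 'above', 1), (1, 'below', 0)]
```

In the actual model each of these steps is a deep neural network; the graph structure itself (objects with positions and attributes, plus relation triples) is what the four sub-networks jointly produce.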




Cite this article
[IEEE Style]
D. Shin and I. Kim, "Deep Neural Network-Based Scene Graph Generation for 3D Simulated Indoor Environments," KIPS Transactions on Software and Data Engineering, vol. 8, no. 5, pp. 205-212, 2019. DOI: https://doi.org/10.3745/KTSDE.2019.8.5.205.

[ACM Style]
Donghyeop Shin and Incheol Kim. 2019. Deep Neural Network-Based Scene Graph Generation for 3D Simulated Indoor Environments. KIPS Transactions on Software and Data Engineering, 8, 5, (2019), 205-212. DOI: https://doi.org/10.3745/KTSDE.2019.8.5.205.