Glossary

Accelerometer - measures acceleration; for ARCore-capable smartphones, it helps enable motion tracking (Module 2).

Anchors - user-defined points of interest upon which AR objects are placed. Anchors are created and updated relative to the geometry (planes, feature points, etc.) that ARCore detects in the environment (Module 2).

Asset - a 3D model used in an AR experience (Module 1).

Augmented Reality (AR) - a direct or indirect live view of a physical, real-world environment whose elements are "augmented" by computer-generated perceptual information (Module 1).

Computer Vision - a blend of artificial intelligence and computer science that aims to enable computers (like smartphones) to visually understand the surrounding world like human vision does (Module 2).

Concurrent Odometry and Mapping (COM) - the motion tracking process ARCore uses to track the smartphone's position in relation to the world around it (Module 2).

Design Document - a guide for your AR experience that contains all of the 3D assets, sounds, and other design ideas for your team to implement (Module 3).

Drift - refers to the accumulation of potential motion tracking error. If you walk around digital assets too quickly, eventually the device's pose may not reflect where you actually are. ARCore attempts to correct for drift over time and updates Anchors to keep digital objects placed correctly relative to the real world (Module 2).

Edit-time - when edits/changes are made in edit mode (non-gameplay mode), before your application or game is deployed (Module 4).

Environmental understanding - understanding the real world environment by detecting feature points and planes and using them as reference points to map the environment. Also referred to as context awareness (Module 2).

Feature Points - visually distinct features in your environment, like the edge of a chair, a light switch on a wall, the corner of a rug, or anything else that is likely to stay visible and consistently placed in your environment. ARCore uses feature points in the captured camera image to compute changes in location, further environmental understanding, and place planes in an AR app (Module 2).

Framing - with regards to mobile AR design, this is the strategic placement of 3D objects in the environment to avoid breaking immersion (Module 4).

Google Poly - a free repository of 3D assets that can be quickly downloaded and used in your ARCore experience (Module 3).

GPS - a global navigation satellite system that provides geolocation and time information; for ARCore-capable smartphones, it helps enable location-based AR apps (Module 2).

Gyroscope - measures orientation and angular velocity; for ARCore-capable smartphones, it helps enable motion tracking (Module 2).

Hit-testing - used to take an (x,y) coordinate corresponding to the phone's screen (provided by a tap or whatever other interaction you want your app to support) and project a ray into the camera's view of the world. This returns any planes or feature points that the ray intersects, along with the pose of that intersection in world space. This allows users to select or otherwise interact with objects in the environment (Module 4).
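The geometry behind hit-testing can be sketched as a ray-plane intersection. The `hit_test` function below is a hand-rolled illustration against a single horizontal plane, not ARCore's actual API; it only shows the underlying math of projecting a ray and finding where it lands.

```python
# Illustrative ray-plane intersection: the math underlying a hit test.
# A ray (origin + t * direction) is tested against the plane y = plane_y.

def hit_test(ray_origin, ray_dir, plane_y=0.0):
    """Return the (x, y, z) point where the ray hits the plane y = plane_y,
    or None if the ray is parallel to or pointing away from the plane."""
    ox, oy, oz = ray_origin
    dx, dy, dz = ray_dir
    if abs(dy) < 1e-9:          # ray parallel to the plane: no hit
        return None
    t = (plane_y - oy) / dy     # distance along the ray to the plane
    if t < 0:                   # intersection lies behind the camera
        return None
    return (ox + t * dx, oy + t * dy, oz + t * dz)

# A camera 1.5 m above the floor, looking down and forward:
print(hit_test((0.0, 1.5, 0.0), (0.0, -1.0, -1.0)))  # → (0.0, 0.0, -1.5)
```

In a real ARCore app the ray direction would be derived from the tapped screen coordinate and the camera's projection; here it is passed in directly to keep the sketch short.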

HMD - Head-Mounted Display (Module 1).

Immersion - the sense that digital objects belong in the real world. Breaking immersion means that the sense of realism has been broken; in AR this is usually by an object behaving in a way that does not match our expectations (Module 2).

Inside-Out Tracking - when the device has internal cameras and sensors to detect motion and track positioning (Module 2).

Light estimation - allows the phone to estimate the environment's current lighting conditions (Module 2).
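For intuition only, a naive version of light estimation can be written as the mean luminance of the camera image. The helper name and approach below are illustrative assumptions; ARCore's real estimator is more sophisticated and also reports color correction.

```python
def estimate_ambient_intensity(pixels):
    """Estimate ambient light as the mean luminance of an image,
    given pixels as (r, g, b) tuples with components in 0..255.
    Returns a value from 0.0 (dark) to 1.0 (bright)."""
    if not pixels:
        return 0.0
    # Rec. 601 luma weights approximate perceived brightness.
    total = sum(0.299 * r + 0.587 * g + 0.114 * b for r, g, b in pixels)
    return total / (255.0 * len(pixels))

# One pure-white pixel and one black pixel average out to ~0.5:
print(estimate_ambient_intensity([(255, 255, 255), (0, 0, 0)]))
```

An app would use an estimate like this to dim or brighten rendered assets so they match the real scene's lighting.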

Magnetometer - measures cardinal direction and allows ARCore-capable smartphones to auto-rotate digital maps depending on physical orientation, which helps enable location-based AR apps (Module 2).

Motion Tracking - in the basic sense, this means tracking the movement of an object in space. ARCore uses your phone's camera, internal gyroscope, and accelerometer to estimate its pose in 3D space in real time (Module 2).

Multi-plane detection - ARCore's ability to detect multiple surfaces at different heights and depths (Module 4).

Occlusion - when one 3D object blocks another 3D object. Currently, this can only happen between digital objects; ARCore objects cannot be occluded by a real-world object (Module 2).

Outside-In Tracking - when the device uses external cameras or sensors to detect motion and track positioning (Module 2).

Phone Camera - supplies a live feed of the surrounding real world upon which AR content is overlaid when using mobile AR (Module 2).

Placing - when the tracking of a digital object is fixed, or anchored, to a certain point in the real world (Module 2).

Plane Finding - the smartphone-specific process by which ARCore determines where horizontal and vertical surfaces are in your environment and uses those surfaces to place and orient digital objects (Module 2).

Pose - the unique position and orientation of any object in relation to the world around it, from your mobile device to the augmented 3D asset that you see on your display (Module 4).
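A pose combines a position and an orientation, and applying it maps points from an object's local space into world space. The sketch below is a minimal illustration that assumes a yaw-only rotation for brevity (a full 3D pose uses a quaternion); the function names are hypothetical.

```python
import math

def make_pose(position, yaw_degrees):
    """A minimal pose: a translation plus a rotation about the vertical axis."""
    return {"position": position, "yaw": math.radians(yaw_degrees)}

def transform_point(pose, point):
    """Map a point from the object's local space into world space:
    rotate by the pose's yaw, then translate by its position."""
    x, y, z = point
    c, s = math.cos(pose["yaw"]), math.sin(pose["yaw"])
    rx, rz = c * x + s * z, -s * x + c * z   # right-handed rotation about Y
    px, py, pz = pose["position"]
    return (rx + px, y + py, rz + pz)

# An asset anchored 2 m in front of the world origin, rotated 90° about Y;
# its local point (1, 0, 0) lands near (0, 0, -3) in world space:
pose = make_pose((0.0, 0.0, -2.0), 90.0)
print(transform_point(pose, (1.0, 0.0, 0.0)))
```

This rotate-then-translate composition is the same operation an AR framework performs when it draws an anchored asset at its pose each frame.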

Raycasting - projecting a ray to help estimate where an AR object should be placed so that it appears on a real-world surface in a believable way; used during hit-testing (Module 3).

Runtime - when edits/changes are made while your app is running or in active gameplay mode. For example, you can download Poly assets while your application is running (Module 4).

Scaling - when a placed AR object changes size and/or dimensions relative to the AR device's position; enabled by environmental understanding (Module 2).

SLAM - Simultaneous Localization and Mapping; a motion tracking process that tracks the device in relation to its surrounding world (Module 2).

Spatial mapping - the ability to create a 3D map of the environment, which helps establish where assets should be posed (Module 4).

Standalone headset - VR or AR headset that does not require external processors, memory, or power (Module 1).

Surface detection - allows ARCore to place digital objects on various surface heights, to render different objects at different sizes and positions, and to create more realistic AR experiences in general (Module 4).

Unity - cross-platform game engine and development environment for both 3D and 2D interactive applications (Module 4).

User Experience (UX) - the process and underlying framework of enhancing user flow to create products with high usability and accessibility for end users (Module 3).

User Flow - the journey of your app's users and how a person will engage, step by step, with your AR experience (Module 3).

User Interface (UI) - the visuals of your app and everything that a user interacts with (Module 3).

User interface metaphor - a familiar pattern that gives the user instantaneous knowledge about how to interact with the user interface, such as a QWERTY keyboard or a computer mouse (Module 2).

Virtual Reality (VR) - the use of computer technology to create a simulated environment, placing the user inside an experience (Module 1).
