Updated: September 18, 2023 (originally published July 5, 2023)

  Charts & Illustrations

Understanding Embeddings, Vectors, and Vector Databases

by Barry Briggs

Before joining Directions on Microsoft in 2020, Barry worked at Microsoft for 12 years in a variety of roles.

Vector search capabilities, which can accelerate AI applications, are in public preview in Azure Cosmos DB and in private preview for Azure Cognitive Search

The engines that power modern AI applications such as ChatGPT, called large language models (LLMs), process numerical data in order to generate responses. User inputs, called prompts, are converted into vectors (arrays of numbers, termed "embeddings"). The quality of the response created by the LLM depends significantly on the ability of the embedding to capture the user's intent, that is, the semantics of the prompt.
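To make the text-in, numbers-out shape of this concrete, here is a toy sketch in Python. It maps a prompt to a fixed-length array of floats by hashing words into buckets; a real embedding model captures semantics with machine learning, which this deliberately does not, so the function name and dimension count are illustrative assumptions only.

```python
import hashlib
import math

DIMS = 8  # real embedding models use hundreds or thousands of dimensions


def toy_embed(prompt: str) -> list[float]:
    """Map a prompt to a fixed-length vector of floats.

    This is NOT a real embedding: it only hashes words into buckets.
    It exists to show the data shape: text in, numeric array out.
    """
    vec = [0.0] * DIMS
    for word in prompt.lower().split():
        bucket = int(hashlib.md5(word.encode()).hexdigest(), 16) % DIMS
        vec[bucket] += 1.0
    # Normalize to unit length, as many embedding models do.
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]


embedding = toy_embed("a flock of bats emerging from a cave")
print(len(embedding))  # fixed length regardless of prompt length
```

Whatever model produces them, the key property is that every prompt lands in the same fixed-dimensional numeric space, which is what makes the similarity comparisons described below possible.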

Modern AI applications use machine learning models trained on vast quantities of text to create the embeddings. These models can analyze both the context and the intent (semantics) of the input prompt.

For example, given the input prompt "When might I see a flock of bats emerging from a cave?" the system recognizes that the word "bats" occurs near other words suggestive of the animal, as opposed to baseball or cricket equipment. In this way the meaning, as illustrated in the diagram on the left, can be disambiguated. In modern AI applications, the embedding vector (the array of numbers) generated from the prompt typically has hundreds or thousands of values (dimensions).
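Vector search, the capability previewing in Azure Cosmos DB and Azure Cognitive Search, boils down to finding the stored vectors closest to a query vector, commonly by cosine similarity. The sketch below uses hand-made three-dimensional vectors standing in for real embeddings (the axis meanings and document texts are invented for illustration):

```python
import math


def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity: 1.0 means same direction, 0.0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


# Made-up 3-D vectors in place of real embeddings; imagine the axes
# as roughly (animal-ness, sport-ness, cave-ness).
documents = {
    "bats roosting in a limestone cave": [0.9, 0.1, 0.8],
    "a batter swings at the first pitch": [0.1, 0.9, 0.0],
}

query = [0.8, 0.2, 0.7]  # stand-in embedding for the bat/cave prompt

# A vector database does this at scale with specialized indexes;
# here a brute-force scan over two documents suffices.
best_match = max(documents, key=lambda d: cosine(query, documents[d]))
print(best_match)  # the cave document, not the baseball one
```

Because semantically similar text yields nearby vectors, the animal sense of "bats" retrieves the cave document even though no keyword matching is involved; that is the search a vector database accelerates.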
