K-Dimensional Tree in Data Structures

K-Dimensional Tree in Data Structures: An Overview

K-dimensional trees (k-d trees) offer an efficient way to store and retrieve points in two-dimensional and higher-dimensional spaces. They are used in applications ranging from computer graphics to scientific computing, speeding up searches over large datasets. In this DSA tutorial, we will explore what k-d trees are, the benefits they offer, and where they can be applied for optimal results.

To further enhance your understanding and application of K-Dimensional Tree concepts, consider enrolling in the Best DSA Course to gain comprehensive insights into effective data structure utilization for improved problem-solving and time management.

What is a K-Dimensional Tree in Data Structures?

A k-dimensional tree, also known as a k-d tree, is a data structure used for organizing points in a k-dimensional space. It is a binary tree in which each node represents a point in k-dimensional space. Each level of the tree splits on one of the k dimensions, cycling through the dimensions as the depth increases, and a node's left and right children hold points with smaller and larger values, respectively, in that dimension.

The k-d tree is particularly useful when searching for nearest neighbor points in large data sets. By efficiently partitioning the data and pruning irrelevant branches, the k-dimensional tree can greatly reduce the time required to find the nearest neighbors. 

Implementation of a K-Dimensional Tree in Data Structures

Below are implementations of insertion and search in Python, Java, and C++.

Example in Python:

import sys

# Number of dimensions
k = 2

# A structure to represent node of kd tree
class Node:
	def __init__(self, point):
		self.point = point
		self.left = None
		self.right = None

# A method to create a node of K D tree
def newNode(point):
	return Node(point)

# Inserts a new node and returns root of modified tree
# The parameter depth is used to decide axis of comparison
def insertRec(root, point, depth):
	# Tree is empty?
	if not root:
		return newNode(point)

	# Calculate current dimension (cd) of comparison
	cd = depth % k

	# Compare the new point with root on current dimension 'cd'
	# and decide the left or right subtree
	if point[cd] < root.point[cd]:
		root.left = insertRec(root.left, point, depth + 1)
	else:
		root.right = insertRec(root.right, point, depth + 1)

	return root

# Function to insert a new point with given point in
# KD Tree and return new root. It mainly uses above recursive
# function "insertRec()"
def insert(root, point):
	return insertRec(root, point, 0)

# A utility method to determine if two Points are same
# in K Dimensional space
def arePointsSame(point1, point2):
	# Compare individual coordinate values
	for i in range(k):
		if point1[i] != point2[i]:
			return False

	return True

# Searches a Point represented by "point[]" in the K D tree.
# The parameter depth is used to determine current axis.
def searchRec(root, point, depth):
	# Base cases
	if not root:
		return False
	if arePointsSame(root.point, point):
		return True

	# Current dimension is computed using current depth and total
	# dimensions (k)
	cd = depth % k

	# Compare point with root with respect to cd (Current dimension)
	if point[cd] < root.point[cd]:
		return searchRec(root.left, point, depth + 1)

	return searchRec(root.right, point, depth + 1)

# Searches a Point in the K D tree. It mainly uses
# searchRec()
def search(root, point):
	# Pass current depth as 0
	return searchRec(root, point, 0)

# Driver program to test above functions
if __name__ == '__main__':
	root = None
	points = [[3, 6], [17, 15], [13, 15], [6, 12], [9, 1], [2, 7], [10, 19]]

	n = len(points)

	for i in range(n):
		root = insert(root, points[i])

	point1 = [10, 19]
	if search(root, point1):
		print("Found")
	else:
		print("Not Found")

	point2 = [12, 19]
	if search(root, point2):
		print("Found")
	else:
		print("Not Found")

Example in Java:

import java.util.*;

class Node {
	int[] point;
	Node left, right;

	public Node(int[] arr)
	{
		this.point = arr;
		this.left = this.right = null;
	}
}

class KDtree {
	int k = 2;

	public Node newNode(int[] arr) { return new Node(arr); }

	public Node insertRec(Node root, int[] point, int depth)
	{
		if (root == null) {
			return newNode(point);
		}

		int cd = depth % k;
		if (point[cd] < root.point[cd]) {
			root.left
				= insertRec(root.left, point, depth + 1);
		}
		else {
			root.right
				= insertRec(root.right, point, depth + 1);
		}

		return root;
	}

	public Node insert(Node root, int[] point)
	{
		return insertRec(root, point, 0);
	}

	public boolean arePointsSame(int[] point1, int[] point2)
	{
		for (int i = 0; i < k; ++i) {
			if (point1[i] != point2[i]) {
				return false;
			}
		}
		return true;
	}

	public boolean searchRec(Node root, int[] point,
							int depth)
	{
		if (root == null) {
			return false;
		}
		if (arePointsSame(root.point, point)) {
			return true;
		}

		int cd = depth % k;
		if (point[cd] < root.point[cd]) {
			return searchRec(root.left, point, depth + 1);
		}
		return searchRec(root.right, point, depth + 1);
	}

	public boolean search(Node root, int[] point)
	{
		return searchRec(root, point, 0);
	}

	public static void main(String[] args)
	{
		KDtree kdTree = new KDtree();

		Node root = null;
		int[][] points
			= { { 3, 6 }, { 17, 15 }, { 13, 15 }, { 6, 12 },
				{ 9, 1 }, { 2, 7 }, { 10, 19 } };

		int n = points.length;

		for (int i = 0; i < n; i++) {
			root = kdTree.insert(root, points[i]);
		}

		int[] point1 = { 10, 19 };
		System.out.println(kdTree.search(root, point1)
							? "Found"
							: "Not Found");

		int[] point2 = { 12, 19 };
		System.out.println(kdTree.search(root, point2)
							? "Found"
							: "Not Found");
	}
}

Example in C++:

#include <iostream>
using namespace std;

const int k = 2;

// A structure to represent node of kd tree
struct Node
{
	int point[k]; // To store k dimensional point
	Node *left, *right;
};

// A method to create a node of K D tree
struct Node* newNode(int arr[])
{
	struct Node* temp = new Node;

	// Copy the k coordinate values into the new node
	for (int i = 0; i < k; i++)
		temp->point[i] = arr[i];

	temp->left = temp->right = NULL;
	return temp;
}

// Inserts a new node and returns root of modified tree
// The parameter depth is used to decide axis of comparison
Node *insertRec(Node *root, int point[], unsigned depth)
{
	// Tree is empty?
	if (root == NULL)
	return newNode(point);

	// Calculate current dimension (cd) of comparison
	unsigned cd = depth % k;

	// Compare the new point with root on current dimension 'cd'
	// and decide the left or right subtree
	if (point[cd] < (root->point[cd]))
		root->left = insertRec(root->left, point, depth + 1);
	else
		root->right = insertRec(root->right, point, depth + 1);

	return root;
}

// Function to insert a new point with given point in
// KD Tree and return new root. It mainly uses above recursive
// function "insertRec()"
Node* insert(Node *root, int point[])
{
	return insertRec(root, point, 0);
}

// A utility method to determine if two Points are same
// in K Dimensional space
bool arePointsSame(int point1[], int point2[])
{
	// Compare individual coordinate values
	for (int i = 0; i < k; ++i)
		if (point1[i] != point2[i])
			return false;

	return true;
}

// Searches a Point represented by "point[]" in the K D tree.
// The parameter depth is used to determine current axis.
bool searchRec(Node* root, int point[], unsigned depth)
{
	// Base cases
	if (root == NULL)
		return false;
	if (arePointsSame(root->point, point))
		return true;

	// Current dimension is computed using current depth and total
	// dimensions (k)
	unsigned cd = depth % k;

	// Compare point with root with respect to cd (Current dimension)
	if (point[cd] < root->point[cd])
		return searchRec(root->left, point, depth + 1);

	return searchRec(root->right, point, depth + 1);
}

// Searches a Point in the K D tree. It mainly uses
// searchRec()
bool search(Node* root, int point[])
{
	// Pass current depth as 0
	return searchRec(root, point, 0);
}

// Driver program to test above functions
int main()
{
	struct Node *root = NULL;
	int points[][k] = {{3, 6}, {17, 15}, {13, 15}, {6, 12},
					{9, 1}, {2, 7}, {10, 19}};

	int n = sizeof(points)/sizeof(points[0]);

	for (int i = 0; i < n; i++)
		root = insert(root, points[i]);

	int point1[] = {10, 19};
	cout << (search(root, point1) ? "Found" : "Not Found") << endl;

	int point2[] = {12, 19};
	cout << (search(root, point2) ? "Found" : "Not Found") << endl;

	return 0;
}

Output

Found
Not Found


Complexity of K-Dimensional Trees

The complexity of a k-dimensional tree, also known as a kd-tree, depends on the specific operation being performed. There are several common operations associated with kd-trees, each with its own complexity.

  1. Construction: Constructing a kd-tree from a set of k-dimensional points is typically performed using a recursive algorithm that splits the points at the median along one axis per level; built this way, the complexity for n points is O(n log n) on average. However, inserting the points one at a time, as in the implementation above, can degrade to O(n^2) in the worst case, for example, when the input points are already sorted or nearly sorted. A median-split construction is sketched after this list.
  2. Searching: Searching for a point in a k-d tree involves traversing the tree in a manner that exploits its hierarchical structure. The complexity of searching in a balanced kd-tree is O(log n), where n is the number of points in the tree. However, in the worst case, when the kd-tree is highly unbalanced, the search complexity can deteriorate to O(n).
  3. Nearest neighbor search: Finding the nearest neighbor to a given query point in a kd-tree requires traversing the tree and pruning branches based on distance calculations. The average complexity of nearest neighbor search in a balanced k-d tree is O(log n), but in the worst case, it can be O(n). A sketch of this pruning search also appears after this list.
  4. Range search: Performing a range search in a k-d tree involves finding all points within a specified range or bounding box. The complexity depends on how many points fall within the range: in the worst case, where all points are inside it, the complexity is O(n), while for a balanced two-dimensional tree it is closer to O(sqrt(n) + m), where m is the number of reported points.
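
To illustrate the construction step, here is a minimal sketch of building a balanced k-d tree by splitting at the median of the current axis at every level. It reuses the Node class and the dimension count k from the Python implementation above; the helper name build_kdtree is purely illustrative.

# A sketch of balanced construction: split the points at the median of the
# current axis at each level (illustrative, not a tuned implementation)
def build_kdtree(points, depth=0):
    if not points:
        return None

    # Axis to split on at this depth
    cd = depth % k

    # Sort on the current axis and pick the median as the subtree root,
    # so the left and right subtrees each receive about half of the points
    points = sorted(points, key=lambda p: p[cd])
    median = len(points) // 2

    node = Node(points[median])
    node.left = build_kdtree(points[:median], depth + 1)
    node.right = build_kdtree(points[median + 1:], depth + 1)
    return node

Sorting at every level makes this sketch O(n log^2 n) in the worst case; selecting the median with a linear-time selection routine instead of a full sort brings the construction closer to the O(n log n) average mentioned above.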
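
Nearest-neighbor search can be sketched in the same style. The following code again reuses Node and k from the Python implementation above; the names nearest, nearestRec, and squared_distance are illustrative. It visits the subtree on the target's side of the splitting plane first and only explores the other side when the plane is closer than the best point found so far.

# Squared Euclidean distance between two k-dimensional points
def squared_distance(p1, p2):
    return sum((p1[i] - p2[i]) ** 2 for i in range(k))

def nearestRec(root, target, depth, best):
    if not root:
        return best

    # Update the best candidate if the current node's point is closer
    if best is None or squared_distance(root.point, target) < squared_distance(best, target):
        best = root.point

    cd = depth % k

    # Descend first into the subtree on the target's side of the splitting plane
    if target[cd] < root.point[cd]:
        near, far = root.left, root.right
    else:
        near, far = root.right, root.left

    best = nearestRec(near, target, depth + 1, best)

    # The far subtree can only contain a closer point if the splitting plane
    # is nearer to the target than the best distance found so far
    if (target[cd] - root.point[cd]) ** 2 < squared_distance(best, target):
        best = nearestRec(far, target, depth + 1, best)

    return best

def nearest(root, target):
    return nearestRec(root, target, 0, None)

With the tree built from the points in the driver above, nearest(root, [12, 19]) should return [10, 19], the stored point closest to the query. A range search follows the same pattern, except that a branch is pruned whenever the query rectangle lies entirely on one side of the splitting plane.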

Advantages of K Dimension Trees

  1. Efficient search: K-D trees are effective for locating points in k-dimensional space, for example in nearest neighbor search or range search.
  2. Space partitioning: K-D trees split the space along one dimension at a time, which keeps each comparison cheap and limits how much of the dataset a query needs to examine.
  3. Versatility: K-D trees can be used for a wide range of applications, such as data mining, computer graphics, and scientific computing.
  4. Balance: A k-d tree built by median splitting is balanced, which keeps queries efficient; note that the simple insertion-based tree shown above is not self-balancing, so skewed insertion orders can degrade it.
  5. Incremental construction: New points can be inserted without rebuilding the entire tree, although many insertions and deletions may unbalance it over time.
  6. Easy to implement: K-D trees are relatively easy to implement and can be written in a variety of programming languages.

Summary

K-Dimensional Trees are an extremely useful data structure with numerous applications in fields such as computer vision and machine learning. They help organize large amounts of multidimensional data so that queries like nearest-neighbor and range searches can be answered quickly. Now that you’re familiar with the K-Dimensional tree’s features, capabilities, and advantages, it’s time to put it into action! For that, consider enrolling in our Best Data Structures And Algorithms Course.

FAQs

Q1. Is K-Dimensional Tree a Binary Tree?

Yes, it is a binary tree in which each node represents a point in k-dimensional space.

Q2. What is the time complexity of searching in a balanced K-Dimensional Tree?

The complexity of searching in a balanced kd-tree is O(log n), where n is the number of points in the tree.

Q3. How to construct a K-Dimensional Tree?

Constructing a kd-tree from a set of k-dimensional points is typically performed using a recursive algorithm.