Integer

An integer is a whole number (not a fraction) that can be positive, negative, or zero. Therefore, the numbers 10, 0, -25, and 5,148 are all integers. Unlike floating point numbers, integers cannot have decimal places.

Integers are a commonly used data type in computer programming. For example, whenever a number is incremented, such as within a "for loop" or "while loop," an integer is used. Integers also serve as indexes that identify an item's location within an array, as shown in the sketch below.
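
A minimal Python sketch of both uses, with illustrative variable names: an integer loop counter and integer indexes into a list (Python's array-like type).

    scores = [88, 92, 75, 64]  # a list of integers

    # The loop counter i is an integer that is incremented on each pass.
    for i in range(len(scores)):
        # i also serves as the index that locates each item in the list.
        print(f"Item {i}: {scores[i]}")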

When two integers are added, subtracted, or multiplied, the result is also an integer. However, when one integer is divided by another, the result may be an integer or a fraction. For example, 6 divided by 3 equals 2, which is an integer, but 6 divided by 4 equals 1.5, which contains a fractional part. A decimal result may be either rounded or truncated to produce an integer.
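
The same arithmetic can be illustrated with a short Python sketch. Note that Python's / operator always returns a floating point value, so the quotient must be explicitly rounded or truncated to get back to an integer.

    # Adding, subtracting, or multiplying integers always yields an integer.
    print(6 + 3)   # 9
    print(6 - 3)   # 3
    print(6 * 3)   # 18

    # Division may produce a fractional result.
    print(6 / 4)   # 1.5

    # The result can be rounded or truncated to produce an integer.
    print(round(6 / 4))  # 2  (rounded to the nearest whole number)
    print(6 // 4)        # 1  (truncated by floor division)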

Updated in 2006 by Per C.
