In statistics, signal processing, and econometrics, a time series is a sequence of data points, typically measured at successive times and spaced at (often uniform) time intervals. Time series analysis comprises methods for understanding such series, either to characterize the underlying process that generated the data points (where did they come from? what generated them?) or to make forecasts (predictions). Time series prediction is the use of a model to predict future events based on known past events: to predict future data points before they are measured. A standard example is forecasting the opening price of a share of stock from its past performance.
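As a minimal sketch of the forecasting idea, the following compares two baseline predictors on a small illustrative series (the price values here are hypothetical, not real market data): the naive forecast, which repeats the last observed value, and the mean forecast, which uses the historical average.

```python
# Hypothetical daily opening prices: a time series is an ordered
# sequence of observations indexed by time.
prices = [101.2, 101.8, 101.5, 102.1, 102.6]

# Naive forecast: tomorrow's value equals today's.
naive_forecast = prices[-1]

# Mean forecast: tomorrow's value equals the historical average.
mean_forecast = sum(prices) / len(prices)

print(naive_forecast)            # 102.6
print(round(mean_forecast, 2))   # 101.84
```

Both baselines use only past observations; the model classes discussed below refine this idea by weighting past values and past shocks in structured ways.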
As shown by Box and Jenkins, models for time series data can take many forms and represent different stochastic processes. When modeling the mean of a process, three broad classes of practical importance are the Autoregressive (AR) models, the Integrated (I) models, and the Moving Average (MA) models (the MA process is related to, but should not be confused with, the concept of a moving average). These three classes depend linearly on previous data points and are treated in more detail in the articles on Autoregressive Moving Average (ARMA) and Autoregressive Integrated Moving Average (ARIMA) models. The Autoregressive Fractionally Integrated Moving Average (ARFIMA) model generalizes all three. Non-linear dependence on previous data points is also of interest, in part because it can produce a chaotic time series.
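The linear dependence on previous data points can be illustrated with the simplest member of the AR class. The sketch below (a minimal stdlib-only illustration; the function names and parameter values are chosen for the example, not drawn from any particular library) simulates an AR(1) process x_t = φ·x_{t-1} + ε_t and recovers φ by ordinary least squares.

```python
import random

def simulate_ar1(phi, n, seed=0):
    """Simulate x_t = phi * x_{t-1} + e_t with standard normal shocks."""
    rng = random.Random(seed)
    x = [0.0]
    for _ in range(n - 1):
        x.append(phi * x[-1] + rng.gauss(0.0, 1.0))
    return x

def fit_ar1(x):
    """OLS slope of x_t on x_{t-1} (no intercept):
    phi_hat = sum(x_t * x_{t-1}) / sum(x_{t-1}^2)."""
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
    return num / den

series = simulate_ar1(phi=0.7, n=5000)
phi_hat = fit_ar1(series)           # close to 0.7 for a long series
one_step_forecast = phi_hat * series[-1]
```

The one-step forecast is just a linear function of the most recent observation, which is exactly the sense in which AR (and, more generally, ARMA) models "depend linearly on previous data points."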
Among non-linear time series models, there are models that represent changes of variance over time (heteroskedasticity). These are the Autoregressive Conditional Heteroskedasticity (ARCH) models, and the collection comprises a wide variety of representations (GARCH, TARCH, EGARCH, FIGARCH, CGARCH, etc.).
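To make the idea of conditional heteroskedasticity concrete, the following is a minimal sketch of an ARCH(1) process, in which the variance of each shock depends on the previous squared shock: σ_t² = ω + α·ε_{t-1}². The parameter values and function name here are illustrative assumptions, not part of the original text.

```python
import random

def simulate_arch1(omega, alpha, n, seed=1):
    """Simulate ARCH(1): e_t = z_t * sigma_t, sigma_t^2 = omega + alpha * e_{t-1}^2,
    with z_t standard normal. Volatility clusters because a large shock
    raises the variance of the next shock."""
    rng = random.Random(seed)
    e = [0.0]
    for _ in range(n - 1):
        sigma2 = omega + alpha * e[-1] ** 2
        e.append(rng.gauss(0.0, 1.0) * sigma2 ** 0.5)
    return e

returns = simulate_arch1(omega=0.2, alpha=0.5, n=10000)

# The unconditional variance of ARCH(1) is omega / (1 - alpha) = 0.4 here,
# even though the conditional variance changes at every step.
sample_var = sum(r * r for r in returns) / len(returns)
```

The variants listed above (GARCH, EGARCH, and so on) generalize this recursion, for example by letting σ_t² also depend on its own past values or respond asymmetrically to positive and negative shocks.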