### High-level optimizations I

**Dead code elimination**

Programmers occasionally write code that can’t be reached by any possible path of execution. For instance, testing `a == 1`, then changing `a`, then accidentally testing `a == 1` again. Such dead, unreachable code is actually quite uncommon and rarely affects performance or code size; it is detected mainly so the compiler can warn the programmer, because it is sometimes a symptom of a bug. Automatically generated code, however, often contains unreachable parts.

**Arithmetic simplification**

Programmers generally write algebraic expressions in their simplest form, but not always; simplification deals with the exceptions. A set of algorithms applies arithmetic identities, trying to find the equivalent expression with the smallest set of instructions. This is not simple, and doing it in general is not possible. For example, would you know that:

sqrt(exp(alog(x)/y)) = exp(alog(x)/(2*y))

Your compiler probably will, but only because this calculation occurs in the Whetstone benchmark, so compiler designers have tuned for it.

**Constant Folding**

Constant folding evaluates constant subexpressions at compile time: for example, if you write 2 * PI * x / 4, the compiler reduces it to 1.570796327 * x. This is very useful, especially because every compiler has performed it for many years, so programmers know it happens and rely on it, which makes programs clearer.

**Common sub-expression elimination (CSE)**

Programmers often repeat equations or parts of equations, e.g.

x = sqrt(M) * cos(θ); y = sqrt(M) * sin(θ);

In this case the common subexpression is the “sqrt(M)” part: the CSE stage stores the result of the first sqrt(M) and reuses it instead of recalculating it. Unlike the other optimizations mentioned so far, CSE has a problem: it can make a program slower in certain circumstances. When the duplicated code is eliminated, its result must be stored so it can be reused, and if it is stored in memory, saving and reloading it can take more time than redoing the operation. Problems like this must be accounted for; in general, most optimizations run the risk of making things worse. Common subexpression elimination is very important in array address calculations, since parts of those calculations can often be reused. You may have heard of GCSE: global CSE is elimination across basic blocks.

At this point it is worth saying a little about the word ‘global’, whose meaning is unfortunately confused. When people speak of ‘global optimizations’ they usually mean optimizations over whole programs, by analogy with global variables. ‘Global CSE’, though, merely means ‘across many basic blocks’, as does global register allocation.

**Data flow**

Most optimizations deal with data flow, so code is first brought into a form that exposes it: a directed acyclic graph (DAG). Here is the DAG of some expressions:

d = a + b
f = c - d
f = e / b
g = b + a
d = f + e

a + b is common to d = a + b and g = b + a, so instead of creating a new node for it, the existing node is reused; this is how common subexpressions are found. Here the code building the DAG knows that + is commutative, and it may apply other algebraic rules as well. The f in f = c - d is overwritten by a later assignment and is therefore dead code, so dead code can be found by looking for nodes with no live variables attached.

