In this post I will present some of the basic knowledge a computer programmer should have in order to solve specific types of problems. This will sharpen developers' skills, not only improving their ability to solve hard problems but also helping them write the most efficient code for a given solution.


**Agenda**

- Introduction
- Runtime analysis
  - Program execution time
  - Best case vs Average case vs Worst case analysis
- Problem solving techniques
  - Complete search
  - Divide and conquer
  - Dynamic programming
  - Greedy
  - Backtracking
- Graph algorithms
  - Representing a graph
    - Array representation
    - Adjacency list
  - Searching a graph
    - Depth first search
    - Breadth first search
  - Shortest path
    - Dijkstra algorithm
  - Maximum flow, minimum cut algorithm
- Example problems
- Conclusion
- References

**Introduction**

The first step towards understanding why the study and knowledge of algorithms are so important is to define exactly what we mean by an algorithm. According to the popular textbook Introduction to Algorithms (second edition, by Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, and Clifford Stein), "an algorithm is any well-defined computational procedure that takes some value, or set of values, as input and produces some value, or set of values, as output." In other words, algorithms are like road maps for accomplishing a given, well-defined task. So, a chunk of code that calculates the terms of the Fibonacci sequence is an implementation of a particular algorithm. Even a simple function for adding two numbers is an algorithm in a sense, albeit a simple one.

Some algorithms, like those that compute the Fibonacci sequence, are intuitive and may be innately embedded into our logical thinking and problem-solving skills. However, for most of us, complex algorithms are best studied so we can use them as building blocks for more efficient logical problem solving in the future. In fact, you may be surprised to learn just how many complex algorithms people use every day when they check their e-mail or listen to music on their computers. This article will introduce some basic ideas related to the analysis of algorithms, and then put them into practice with a few examples illustrating why it is important to know about algorithms **(future posts)**.

**Runtime analysis**

One of the most important aspects of an algorithm is how fast it is. It is often easy to come up with an algorithm to solve a problem, but if the algorithm is too slow, it's back to the drawing board.

**Program execution time**

Since the exact speed of an algorithm depends on where the algorithm is run, as well as the exact details of its implementation, computer scientists typically talk about the run-time relative to the size of the input. For example, if the input consists of N integers, an algorithm might have a run-time proportional to N^2, represented as O(N^2). This means that if you were to run an implementation of the algorithm on your computer with an input of size N, it would take roughly C*N^2 seconds, where C is some constant that doesn't change with the size of the input.
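As a quick illustrative sketch (not from the original post, and the function name is my own), here is a routine whose run-time grows as O(N^2): two nested loops over N items perform roughly C*N^2 comparisons regardless of the machine it runs on.

```python
def count_equal_pairs(values):
    """Count pairs (i, j) with i < j and values[i] == values[j].

    The nested loops compare every pair once, so for N items the
    work grows proportionally to N^2 -- an O(N^2) algorithm.
    """
    n = len(values)
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            if values[i] == values[j]:
                count += 1
    return count
```

Doubling N roughly quadruples the number of comparisons, which is exactly what the O(N^2) notation captures.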

**Best case vs Average case vs Worst case analysis**

In order to show the difference between these types of algorithm analysis, I will present a simple problem: linear search. The input is an array and a specific element to search for; the output is the index of that element. The algorithm loops over the array elements until it finds the target and returns its index, or returns -1 if it reaches the end of the array.

- Best case: if the element is at the start of the array, the search takes only one step.
- Average case: if the element is in the middle of the array, the search takes about N/2 steps to get a result.
- Worst case: if the element doesn't exist, the search takes N steps to get a result. Obviously, designing a solution that survives the worst case is the only way for a developer to survive :).
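The linear search described above can be sketched in a few lines (a minimal version; the function name is my own):

```python
def linear_search(arr, target):
    """Return the index of target in arr, or -1 if it is absent.

    Best case:    target is at index 0        -> 1 step.
    Average case: target is near the middle   -> ~N/2 steps.
    Worst case:   target is not in the array  -> N steps.
    """
    for i, value in enumerate(arr):
        if value == target:
            return i
    return -1
```

The three cases correspond directly to where (or whether) the loop finds the target.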

**Problem solving techniques**

No single technique solves every problem, but one can say that each category of problems tends to yield to a particular technique. Still, every problem has its own idea, so the problem solver must adapt the chosen technique to the problem at hand.

**Complete search** (brute force)

Complete search exploits the brute-force, straightforward, try-them-all method of finding the answer. This should almost always be the first algorithm/solution you consider. If it works within the time and space constraints, then do it: it's easy to code and usually easy to debug, which means you'll have more time to work on the hard problems, where brute force doesn't work quickly enough. If a problem has only a couple of million possibilities or fewer, iterate through each one of them and see if it yields the answer.
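A minimal sketch of the try-them-all approach (my own illustrative example, not from the original post): to find two numbers in a list that sum to a target, simply try every pair.

```python
from itertools import combinations

def pair_with_sum(numbers, target):
    """Brute force: enumerate every pair and check whether it works.

    For N numbers this tries N*(N-1)/2 pairs -- fine for small inputs,
    which is exactly when complete search is the right first choice.
    """
    for a, b in combinations(numbers, 2):
        if a + b == target:
            return (a, b)
    return None  # no pair sums to target
```

When the candidate space is small enough, this kind of exhaustive check is easy to write and easy to trust.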

**Divide and conquer**

A divide and conquer algorithm works by recursively breaking down a problem into two or more sub-problems of the same (or related) type, until these become simple enough to be solved directly. The solutions to the sub-problems are then combined to give a solution to the original problem.

This technique is the basis of efficient algorithms for all kinds of problems.
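Merge sort is a classic instance of this pattern; a compact sketch (my own, for illustration):

```python
def merge_sort(arr):
    """Divide and conquer sort.

    Divide: split the array in half.
    Conquer: recursively sort each half until a sub-array is
             simple enough (length <= 1) to be solved directly.
    Combine: merge the two sorted halves into one sorted array.
    """
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Combine step: merge two already-sorted lists.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

The recursion does the "divide and conquer"; the merge loop is the "combine" step that assembles the final answer from the sub-solutions.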

**Dynamic programming**

DP is an algorithmic technique that is usually based on a recurrent formula and one (or a few) starting states. A sub-solution of the problem is constructed from previously found ones. DP solutions typically have polynomial complexity, which ensures a much faster running time than techniques like backtracking or brute force.
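The Fibonacci sequence mentioned in the introduction is the textbook illustration: a recurrent formula F(n) = F(n-1) + F(n-2) with starting states F(0) = 0 and F(1) = 1. A minimal bottom-up DP sketch:

```python
def fib(n):
    """Bottom-up DP for Fibonacci.

    Starting states: F(0) = 0, F(1) = 1.
    Recurrence:      F(n) = F(n-1) + F(n-2).
    Each sub-solution is built from previously computed ones,
    giving O(n) time instead of the exponential plain recursion.
    """
    if n < 2:
        return n
    prev, curr = 0, 1
    for _ in range(2, n + 1):
        prev, curr = curr, prev + curr
    return curr
```

Keeping only the last two sub-solutions is enough here, so the sketch also uses O(1) memory.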

**Greedy technique**

A greedy algorithm is an algorithm that follows the problem-solving heuristic of making the locally optimal choice at each stage, with the hope of finding a global optimum. On some problems a greedy strategy does not produce an optimal solution; nonetheless, a greedy heuristic may yield locally optimal solutions that approximate a globally optimal one. Be careful.

For example, a greedy strategy for the traveling salesman problem (which is of a high computational complexity) is the following heuristic: "At each stage visit an unvisited city nearest to the current city". This heuristic need not find a best solution but terminates in a reasonable number of steps; finding an optimal solution typically requires unreasonably many steps.
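The nearest-neighbor heuristic above can be sketched as follows (an illustrative sketch assuming the cities are given as a distance matrix; function and parameter names are my own):

```python
def nearest_neighbor_tour(dist, start=0):
    """Greedy TSP heuristic: from the current city, always move to
    the nearest unvisited city.

    dist is an N x N matrix of pairwise distances. The tour found
    this way terminates in N steps but is not guaranteed optimal --
    the locally best hop can force expensive hops later.
    """
    n = len(dist)
    tour = [start]
    unvisited = set(range(n)) - {start}
    while unvisited:
        current = tour[-1]
        # Locally optimal choice: the closest city not yet visited.
        nxt = min(unvisited, key=lambda city: dist[current][city])
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour
```

This makes the trade-off concrete: each step is cheap and the whole tour is built in polynomial time, but nothing guarantees the resulting tour is the shortest one.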

Continue to the next section

