Where to practice your F# with fun?

Scott Wlaschin recently published an excellent post, “Twenty six low-risk ways to use F# at work”, with guides and samples showing how to improve your productivity at work with F#. But how do you train your F# skills? The best answer is to read the excellent F# books and to contribute to open-source F# projects. However, that may not be entertaining enough if you want to deepen your understanding of the language and practice it in combat. (Nevertheless, contributing to open source is the most useful way.)

I have tried to collect all the online competitions and problem archives known to me that either accept F# source code or do not require you to send source code to the server at all. This post was inspired by Alex Ivanovs’s post “14 Coding Challenges to Help You Train Your Brain”.

Resources that support F# compilation.

HackerRank

HackerRank hosts competitive programming challenges and contests across computer science domains.

SPOJ (Sphere Online Judge)

Sphere Online Judge is a problem archive, online judge, and contest hosting service that accepts solutions in many languages.

CodeChef

CodeChef is a global programming community that hosts contests, training sessions, and events for programmers around the world.

Hello World Open

Hello World Open is a coding world championship in which teams program an AI to race virtual slot cars.

Language-independent resources.

Sites that ask you to solve problems and submit only the answer, without obligating you to send source code.

Project Euler

Project Euler is a series of challenging mathematical/computer programming problems that will require more than just mathematical insights to solve. Although mathematics will help you arrive at elegant and efficient methods, the use of a computer and programming skills will be required to solve most problems.

Rosalind

Rosalind is a platform for learning bioinformatics through problem solving.

Google Code Jam

Google Code Jam is a once-a-year competition. It is back in action, challenging professional and student programmers around the globe to solve difficult algorithmic puzzles.

Kaggle

Kaggle is a platform for predictive modelling and analytics competitions on which companies and researchers post their data and statisticians and data miners from all over the world compete to produce the best models.

P.S. If you know other competitions or training sites that support F#, please leave links in the comments.

F#/.NET function minimization (optimization)

I have done some research on function minimization algorithms implemented in .NET. A short summary can be found below.

Gradient descent

Gradient descent is one of the simplest function optimization algorithms, and you can easily implement it yourself; see the sketch below.
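
Here is a minimal hand-rolled sketch. Everything in it (the update rule, the fixed step size, and the iteration count) is a deliberately simple illustration rather than production-ready optimizer code:

let gradientDescent (grad: float array -> float array) rate steps (start: float array) =
    // One update: x' = x - rate * grad(x)
    let step (x: float array) =
        Array.map2 (fun xi gi -> xi - rate * gi) x (grad x)
    // Apply the update a fixed number of times
    Seq.fold (fun x _ -> step x) start (seq { 1 .. steps })

//Example: minimize f(a,b) = (a-3)^2 + (b+1)^2, whose gradient is [|2*(a-3); 2*(b+1)|]
let descentMin =
    gradientDescent (fun x -> [|2.0 * (x.[0] - 3.0); 2.0 * (x.[1] + 1.0)|]) 0.1 1000 [|0.0; 0.0|]
// descentMin is approximately [|3.0; -1.0|]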

DotNumerics

DotNumerics is a Numerical Library for .NET. The library is written in pure C# and has more than 100,000 lines of code with the most advanced algorithms for Linear Algebra, Differential Equations and Optimization problems.

Unfortunately, DotNumerics does not have detailed documentation, so let’s go through all the minimization algorithms implemented in it. First of all, we implement the banana function from the Simplex method example available on the library site.

#r @"DotNumerics.dll"
open System
open DotNumerics.Optimization

//f(a,b) = 100*(b-a^2)^2 + (1-a)^2
let BananaFunction (x: float array) =
    100.0 * Math.Pow((x.[1] - x.[0] * x.[0]), 2.0) + Math.Pow((1.0 - x.[0]), 2.0)
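
The banana function is the classic Rosenbrock function, whose global minimum is at (1, 1) with value 0, so a quick sanity check in F# Interactive looks like this:

BananaFunction [|1.0; 1.0|]  // val it : float = 0.0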

Downhill Simplex

Downhill Simplex method of Nelder and Mead

The key advantage of Downhill Simplex method is that it does not require the gradient function. All you need is a function and an initial guess.

let initialGuess = [|0.1; 2.0|]

let simplexMin =
    let simplex = Simplex()
    simplex.ComputeMin(BananaFunction, initialGuess)

We have a bit of control over the evaluation process: we can restrict MaxFunEvaluations and specify a custom Tolerance on the Simplex model. In that case, model instantiation looks like this:

    let simplex = Simplex(MaxFunEvaluations = 10000, Tolerance = 1e-5)

Truncated Newton

“A Survey of Truncated-Newton Methods”, Journal of Computational and Applied Mathematics.

All the other algorithms require a gradient function for their calculations.

//f'a(a,b) = (100*(b-a^2)^2 + (1-a)^2)'a = 100*2*(b-a^2)*(-2a) - 2*(1-a)
//f'b(a,b) = (100*(b-a^2)^2 + (1-a)^2)'b = 100*2*(b-a^2)
let BananaFunctionGradient (x: float array) =
    [|100.0 * 2.0 * (x.[1] - x.[0] * x.[0]) * (-2.0 * x.[0]) - 2.0 * (1.0 - x.[0]);
      100.0 * 2.0 * (x.[1] - x.[0] * x.[0])|]

let newtonMin =
    let newton = TruncatedNewton()
    newton.ComputeMin(BananaFunction, BananaFunctionGradient, initialGuess)

The Truncated Newton algorithm has three more configuration parameters than Downhill Simplex: Accuracy, MaximunStep, and SearchSeverity.
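
These appear to be settable in the same way as the Simplex parameters above; the values below are arbitrary illustrations, not recommended settings:

    let newton = TruncatedNewton(Accuracy = 1e-7, MaximunStep = 10.0, SearchSeverity = 0.25) // illustrative values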

L-BFGS-B

Limited-memory Broyden–Fletcher–Goldfarb–Shanno method with bound constraints

let bfgsMin =
    let lbfgsb = L_BFGS_B()
    lbfgsb.ComputeMin(BananaFunction, BananaFunctionGradient, initialGuess)

L-BFGS-B has one more configuration parameter than Downhill Simplex: AccuracyFactor.
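
Again, assuming the same property-style configuration, an illustrative (not recommended) instantiation could look like this:

    let lbfgsb = L_BFGS_B(AccuracyFactor = 1e7) // illustrative value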

Results

Below you can find the evaluation results produced by the models with default parameters (timings reported by the #time directive in F# Interactive).

Real: 00:00:00.024, CPU: 00:00:00.062, GC gen0: 0, gen1: 0, gen2: 0
val simplexMin : float [] = [|0.999999998; 0.9999999956|]
Real: 00:00:00.074, CPU: 00:00:00.078, GC gen0: 0, gen1: 0, gen2: 0
val newtonMin : float [] = [|0.9999999999; 0.9999999999|]
Real: 00:00:00.137, CPU: 00:00:00.140, GC gen0: 0, gen1: 0, gen2: 0
val bfgsMin : float [] = [|1.0; 1.0|]