Australia Day weekend in Halls Gap

Despite living in Victoria all of my life, I had never actually visited Halls Gap until recently. Having booked a luxury residence for the Australia Day weekend (26th January), I headed up to the Grampians with my trusty Canon.

The wildlife was plentiful: kookaburras, kangaroos and the occasional emu. The emus seemed very proficient at avoiding my camera lens.

The colours of the Australian bushland were on full display this weekend. We had spectacular weather, though it was on the warm side.

The only thing I am disappointed with is that it took me so many years to discover Halls Gap. I need to come back with more time available for exploring the abundance of hiking tracks and photographic opportunities.

Posted in leisure, photography | Comments Off on Australia Day weekend in Halls Gap

Masons Falls

Today’s weather in Melbourne was an absolutely gorgeous 29 degrees centigrade. So I thought I would hop in the car and take myself to Masons Falls for a beautiful afternoon stroll. A work colleague had mentioned Masons Falls to me a few weeks back; I had never heard of it. So it was time to check it out and take my camera.

Masons Falls is a 45-metre-high set of falls that runs almost all year round and feeds a creek that eventually becomes Arthur’s Creek. The falls are located just outside of Kinglake, and once you have parked the car, a leisurely 600-metre walk leads you to a viewing platform with a wonderful panoramic view of the falls. The picnic facilities look excellent. I will have to come back another time when I am more organised and come fully equipped for a picnic. It would be a beautiful location to just relax with family and friends over a BBQ.

I hadn’t explored much of the bushland in this area since the 2009 Black Saturday bushfires, and it is amazing to see how well the natural bushland has recovered over the years. From what I have read, the area surrounding Masons Falls has changed significantly since the fires. The loss of almost all vegetation led to significant soil erosion, and as a result many gullies have formed in the years since. As the vegetation has re-established, the rate of erosion has slowed significantly and the landscape is definitely recovering.

I wish I had discovered this place many years ago. It is beautiful, very relaxing and I will definitely be back with my camera.

Posted in leisure, photography | Comments Off on Masons Falls

Bayes’ Theorem

Bayes’ Theorem describes the probability of an event, based on prior knowledge that may be related to that event. The theorem was named after Reverend Thomas Bayes, who put forward an equation that allows new evidence to update beliefs.

Bayes’ Theorem is the basis of the Bayesian family of machine learning techniques.

Before we dive into Bayes’ Theorem, let’s define some terms:

Posterior probability (often called the posterior)
the probability that a hypothesis is true, calculated in the light of the relevant observations.

Prior probability (often called the prior)
the probability of an event, reflecting established beliefs about the event before the arrival of new evidence or information.

In general terms, Bayes’ Theorem states that we have a prior probability to begin with and as a result of running a test we obtain some new evidence. The prior probability combined with the new evidence results in a posterior probability.

prior probability + evidence -> posterior probability

Let’s say we know that the prior probability of event H occurring is 2%. The prior belief about H is written as P(H), which stands for “the probability of H”.

P(H) = 0.02 = 2%

To make this example more concrete, let’s say that P(H) is the probability of a particular disease occurring in a population.

We now look for evidence (E). Bayes’ Theorem tells us precisely what the revised (posterior) belief about H is, given the evidence E. This is written as P(H | E), meaning “the probability of the hypothesis H given the evidence E”.

So in our case, Bayes’ Theorem answers the question: what is the probability that the patient has the disease, given that the test has come back positive?

In order to answer this we also need to know the test sensitivity, P(E | H): the chance of a positive test given that the patient has the disease. Let’s say for this example that P(E | H) is 0.9, or 90%.

Given this, we can then express the posterior as

P(H | E) = ( P(H) * P(E | H) ) / P(E)

The denominator in this formula, P(E), is the probability of the evidence irrespective of our knowledge about H. Since H is either true or false, it can be expanded as

P(E) = P(E | H) * P(H) + P(E | not H) * P(not H)

So the full Bayes’ Theorem is:

P(H | E) = ( P(H) * P(E | H) ) / ( P(E | H) * P(H) + P(E | not H) * P(not H) )

For our example we know that P(H) = 0.02 and P(E | H) = 0.9, so the numerator, P(H) * P(E | H), is 0.018.

This leaves the term P(E | not H): the probability of a positive test result for a patient who does not have the disease. For this example, it is reasonable to assume this probability is 0.1.

Hence the denominator is equal to 0.9 * 0.02 + 0.1 * 0.98, which is 0.116. Since the numerator was 0.018, we conclude that P(H | E) = 0.018 / 0.116 ≈ 0.1552.

So, starting from a belief that the probability of the disease is 2%, once we know that the patient has returned a positive test we can revise our belief to P(H | E) ≈ 15.5%. The evidence has clearly had a significant impact on our belief.
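To check the arithmetic, here is a minimal Python sketch of the calculation (the function and argument names are mine, purely for illustration):

def posterior(prior, sensitivity, false_positive_rate):
    # Numerator: P(H) * P(E | H)
    numerator = prior * sensitivity
    # Denominator: P(E) = P(E | H) * P(H) + P(E | not H) * P(not H)
    evidence = sensitivity * prior + false_positive_rate * (1 - prior)
    return numerator / evidence

# The disease example: 2% prior, 90% sensitivity, 10% false positive rate.
print(posterior(0.02, 0.9, 0.1))  # ~0.1552, i.e. roughly 15.5%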

Posted in Machine Learning, Programming | Comments Off on Bayes’ Theorem

What is machine learning?

Machine learning is a subcategory of artificial intelligence.

Machine learning applies statistical techniques to learn from a set of examples. It is all about learning from examples.

The most common form of machine learning is supervised machine learning. Supervised learning is the machine learning task of inferring a function from labeled training data.

The training data consists of a set of training examples that need to be understood really well. The data also needs to be of high quality.

The accuracy of the results is then confirmed by validating the machine learning system against previously unseen data whose results are also well understood. It is critical to understand the accuracy of a machine learning system in order to appreciate the quality of the results it produces when presented with previously unseen data.
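As a concrete sketch of this validation step (assuming Python with the scikit-learn library; the data set and model choice are just for illustration):

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Load a small, well-understood labelled data set.
X, y = load_iris(return_X_y=True)

# Hold back 25% of the examples; the model never sees these during
# training, so they act as the previously unseen data.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Accuracy on the held-out set indicates how far to trust predictions
# on genuinely new data.
print(accuracy_score(y_test, model.predict(X_test)))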

In machine learning we provide features (or input) of a data set to an algorithm in order to identify labels (output).

For example:

Let’s say we wanted to apply a machine learning algorithm to categorise animals. The features may be such things as the number of legs, whether the animal has a tail, or whether the animal has antlers. The labels that we desire as output could be the categorisation of the species, such as Mammal, Bird, Reptile, Fish, Amphibian, Bug or Invertebrate.

Through the application of a machine learning algorithm we try to define what is called a decision surface. A decision surface is the boundary between labels given a set of features.

For example, a linear decision surface:

[Figure: a linear decision surface separating two labels in feature space]
For a previously unseen piece of data where “feature A” and “feature B” are applicable, we should be able to categorise it as either “label 1” or “label 2” with a high degree of certainty, provided the machine learning algorithm is sufficiently accurate and has been trained on a large enough data set.

So in summary, a machine learning algorithm takes a known set of data and transforms it to produce a decision surface so that all future cases can be classified.
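A toy version of the animal example above, again assuming scikit-learn (the animals and feature values are made up for illustration):

from sklearn.tree import DecisionTreeClassifier

# Features: [number of legs, has a tail (0/1), has antlers (0/1)]
features = [
    [4, 1, 0],  # dog
    [4, 1, 1],  # deer
    [2, 1, 0],  # sparrow
    [2, 1, 0],  # kookaburra
    [0, 1, 0],  # snake
]
labels = ["Mammal", "Mammal", "Bird", "Bird", "Reptile"]

# Fitting the model carves the feature space into regions; the boundaries
# between those regions form the decision surface.
model = DecisionTreeClassifier().fit(features, labels)

# A previously unseen animal is classified by which region it falls into.
print(model.predict([[4, 1, 0]]))  # a cat, say -> ['Mammal']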

Posted in Machine Learning, Programming | Tagged | Comments Off on What is machine learning?

Procs, Blocks and Lambdas

I was contemplating writing my own piece on the differences between Ruby blocks, procs and lambdas in an attempt to solidify my understanding. However, I found these two excellent articles that do a much better job than I could possibly do, so I thought I would post them here together, for my own reference more than anything.

Ruby Procs and the Difference Between Them by Alan Skorkin

Jessica Kerr’s explanation is particularly interesting since she looks at the pitfalls of approaching Ruby with a purist functional mindset, specifically treating a block as a function.

Passing functions in Ruby: harder than it looks by Jessica Kerr

Posted in Programming | Tagged , | Comments Off on Procs, Blocks and Lambdas

Programming Language Trends

I stumbled upon Programming Language Trends by Drew Conway today.

The end result is a funky visualisation of the popularity of programming languages by looking at their usage on Github and StackOverflow.

I found it interesting and thought others might too. Enjoy.

Posted in Programming | Comments Off on Programming Language Trends

The Ride to Conquer Cancer 2013

In 2012, I was diagnosed with Melanoma. Melanoma is very dangerous if not found early. Fortunately mine was found early and was a Clark Level I Melanoma, and I have since made a full recovery.

Cancer is something that has touched many of my friends and family. My own experience has spurred me on to try and make a difference. In 2012 I committed to The Ride To Conquer Cancer fundraiser. It was incredibly satisfying raising the funds and facing the challenge of the two-day, 200km ride from Melbourne to Healesville.

This year my son and I have signed up for The Ride To Conquer Cancer again. We will be cycling over 200km from Melbourne through the Mornington Peninsula on 26-27 October, 2013.

If you are like me and would like to see this horrible disease conquered, sponsor either my son or myself on this epic ride.

You can donate to either one of us by clicking on one of the links below:

Donate to Andy’s Epic ride

Donate to Joby’s Epic ride

Posted in Riding | Comments Off on The Ride to Conquer Cancer 2013