Friday, November 27, 2009

DNS resolving to 1.0.0.0

The most peculiar thing happened at home today. I think it's a Karmic issue, since I never had this problem before.

Well, what happened was, I was unable to browse the internet (Firefox and Chrome both had problems, but Opera worked fine). Pinging worked. Other Windows machines didn't have problems at all. sudo apt-get update didn't work because every mirror I tried was resolving to 1.0.0.0. I tried this with Firefox first:
In the address bar, type about:config and filter for ipv6. There should be only one entry listed - network.dns.disableIPv6. Double-click to change the value to true. Problem solved for Firefox.

I continued googling around and did this:

#in /etc/resolv.conf, changed nameserver to Streamyx's
#previously it was 192.168.1.1, which points to my router
#Generated by NetworkManager
nameserver 202.188.0.133



And now everything works again. The only catch is that if I reboot, it changes back to 192.168.1.1.
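The reboot problem is NetworkManager regenerating /etc/resolv.conf. One workaround I know of on Karmic-era Ubuntu (assuming dhclient3 is handling your DHCP - check that /etc/dhcp3/dhclient.conf exists) is to have dhclient prepend the nameserver itself:

```
#in /etc/dhcp3/dhclient.conf
prepend domain-name-servers 202.188.0.133;
```

With that line, the Streamyx nameserver gets written ahead of whatever the router hands out each time the lease renews.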

Putting MySurfGuard to test

So I had to run a Ruby script to assess how good MySurfGuard is (it's basically a package of DansGuardian, Webmin and Squid). I had 5000+ URLs in a text file, supposedly porn sites, each with a human-verified remark on whether it actually is one. I got it from the Untangle website. They tested the script on some other apps, so I made some minor changes for DansGuardian (mysurfguard-test.rb).

  1. Install MySurfGuard.

  2. I kinda cheated a little bit - took the updated blacklist from here and the phraselist from here so MySurfGuard has the latest copy.

  3. Install ruby, lynx and wget (the script needs them). I was working on CentOS, so:

     $ yum install ruby ruby-devel lynx wget

  4. Set the lynx configuration file to use the proxy - put in your proxy address and port:

     #in /etc/lynx-site.cfg
     http_proxy:http://localhost:8080/

  5. For wget to use the proxy, do:

     $ export http_proxy="http://localhost:8080"

  6. Run the script:

     $ ./mysurfguard-test.rb results-human.txt results-mysurfguard.txt

You'll have the result in results-mysurfguard.txt.
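I haven't posted the script's internals here, but the comparison it does can be sketched in a few lines of Ruby. This assumes a hypothetical file format - one URL and one verdict per line, whitespace-separated - which may differ from what mysurfguard-test.rb actually writes:

```ruby
# Compare the human-verified verdicts against MySurfGuard's verdicts.
# Assumed (hypothetical) line format: "http://example.com blocked"

def load_verdicts(path)
  File.readlines(path).each_with_object({}) do |line, verdicts|
    url, verdict = line.split
    verdicts[url] = verdict if url && verdict
  end
end

def accuracy(human_file, filter_file)
  human  = load_verdicts(human_file)
  filter = load_verdicts(filter_file)
  common = human.keys & filter.keys      # only score URLs both files saw
  return 0.0 if common.empty?
  agree = common.count { |url| human[url] == filter[url] }
  agree.to_f / common.size
end
```

An accuracy near 1.0 would mean MySurfGuard agrees with the human verdicts on almost every URL; false positives and false negatives could be counted the same way.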

Tuesday, November 24, 2009

I love my charmbracelet

Makes me want to get another one. That's going to cost me another RM100. I don't care! I want another one, and another one, and another one!

[caption id="attachment_2044" align="aligncenter" width="300" caption="Pretty pretty thing!"]

Media_httpblogcawanpi_qhjng

[/caption]

Wednesday, November 18, 2009

Fuzzy reasoning & Artificial Neural Network

Interesting assignment for my AI paper. The first one asks me to calculate someone's learning difficulty level, given his IQ level and his recent test score (education domain). The other asks to predict a patient's H1N1 risk level, given the severity of these symptoms: fever, breathing difficulty, fatigue and coughing.

Fuzzy reasoning is useful when you're dealing with 'high IQ', 'average IQ', 'low IQ', etc. How would you translate 'high' or 'average' into crisp numbers - say, above 160 it's 'high IQ', above 130 it's 'average IQ'? Of course the numbers won't suit everyone (I may have a different opinion on what counts as 'high IQ'), and you can't use a simple if-else algorithm like we normally would. Fuzzy sets, on the other hand, allow 160 to fall under both 'high IQ' and 'average IQ' at, say, 0.8 and 0.6 (on a scale of 0 to 1) respectively. Once you have defined all that, you need fuzzy rules to work on. Given someone's IQ level and his test score, you can calculate his learning difficulty level, and maybe use the result to propose a new set of tests that match his level.
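As a toy sketch of the membership idea (the breakpoints below are made up by me, not from the assignment):

```ruby
# Trapezoidal membership function: ramps up from a to b, stays at 1.0
# between b and c, ramps down from c to d. Returns a degree in 0..1.
def trapezoid(x, a, b, c, d)
  return 0.0 if x <= a || x >= d
  return (x - a).to_f / (b - a) if x < b
  return 1.0 if x <= c
  (d - x).to_f / (d - c)
end

# Illustrative, made-up fuzzy sets for IQ
def average_iq(iq)
  trapezoid(iq, 70, 90, 110, 140)
end

def high_iq(iq)
  trapezoid(iq, 110, 140, 250, 260)
end
```

So an IQ of 120 belongs to 'average IQ' and 'high IQ' at the same time, each to some degree; the fuzzy rules then combine those degrees with the test-score memberships to produce the difficulty level.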

Artificial neural networks emulate how the brain works - well, sort of. They have the capability to learn and predict stuff. In the H1N1 case, the model will learn to make a very good prediction of whether the patient is at high risk of being infected with H1N1 or not. It depends a lot on the number of cases: the more cases you feed the model, the better the results will be.
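As the tiniest possible illustration - not the actual assignment model - here is a single perceptron (one "neuron") trained on made-up symptom vectors, where each input is 0 or 1 for [fever, breathing difficulty, fatigue, coughing] and the label is 1 for high risk:

```ruby
# Train a single perceptron on (inputs, label) pairs. The data and the
# 0/1 encoding are made up for illustration; a real model would need
# real cases, graded severities, and more than one neuron.
def train_perceptron(samples, epochs: 20, rate: 0.1)
  weights = Array.new(samples.first[0].size, 0.0)
  bias = 0.0
  epochs.times do
    samples.each do |inputs, label|
      output = predict(weights, bias, inputs)
      error  = label - output              # 0 when the guess was right
      inputs.each_with_index { |x, i| weights[i] += rate * error * x }
      bias += rate * error
    end
  end
  [weights, bias]
end

def predict(weights, bias, inputs)
  sum = bias + inputs.each_with_index.sum { |x, i| x * weights[i] }
  sum > 0 ? 1 : 0
end
```

Feeding it more (and more varied) cases is exactly what improves it - with a handful of toy samples it can only learn a crude rule.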

Personally I think it's all mathematics - everywhere you see calculations. Once you understand how it works, it's pretty easy (the data I worked on isn't that big, hahaha). In the real world, data sets are huge, so data mining is actually intriguing, I think. Worth knowing.

At the risk of my classmates copying, I'm going to upload what I have done so far - I doubt they'll find it here until much later anyway. If you do manage to get here, hey, I really don't mind you looking around. Just be kind enough to give feedback if there are mistakes in the doc :P.

Okay. HUNGRY. AND SLEEPY.

Monday, November 16, 2009

Charles and Keith, I love you!

[1] Even though they have been around for over a decade, they only opened their first store in Malaysia last year. The shoes are oh so pwetty! I'm so happy that they have finally made it to our shores. Took one pair home with me last Saturday. http://www.charleskeith.com/

[caption id="attachment_2027" align="aligncenter" width="421" caption="Would you look at that.. I'm melting"]

Media_httpblogcawanpi_cjyde

[/caption]

[caption id="attachment_2029" align="aligncenter" width="412" caption="A very sexy pair for work"]

Media_httpblogcawanpi_jiglb

[/caption]

[caption id="attachment_2028" align="aligncenter" width="413" caption="Awww mummy wants"]

Media_httpblogcawanpi_jfecp

[/caption]

[2] I finally got myself a charmbracelet! Yessss!

Thursday, November 12, 2009

Beef burger

[caption id="attachment_2018" align="aligncenter" width="502" caption="Homemade beef burger"]

Media_httpblogcawanpi_fikfx

[/caption]

Dead easy to make. Cooking is easy and quick if you plan your way around it. I made a lot of patties for quick snacks at night.

The recipe?

Friday, November 6, 2009

Some pics on 24 Hour OSS Webdev Competition

We provided them food, unlimited coffee, a PC with Ubuntu installed, 2 wired connections, 2 wireless connections, a huge desk, enough chairs and a couch. The rest - gadgets, devices, books, cables, wires - they brought themselves.

[caption id="attachment_2006" align="aligncenter" width="300" caption="Stopwatch and huge clock on screen. It would start at 11am. Arm did this - 'fuyyo' was my first reaction when I saw it the first time."]

Media_httpblogcawanpi_rzpba

[/caption]

Tuesday, November 3, 2009

Can you guess what kind of drink is that?

[caption id="attachment_1998" align="aligncenter" width="225" caption="Magic concoction"]

Media_httpblogcawanpi_ghrta

[/caption]


Last time, when I was so sick that I kept throwing up, this actually helped it stop (as in it stopped the puking part, but I still got admitted to hospital for a few days). Also, if you plan on having a looooong day or looooooong night - I'd recommend this. I'm loving it.

Html Validator on FF3.5 on Karmic

Note to self:

[1] Add PPA https://launchpad.net/~bdrung/+archive/ppa
[2] sudo apt-get update
[3] sudo apt-get source htmlvalidator
[4] In xpi/install.rdf.in, change maxVersion 3.1.* to 3.5.*
[5] Install the build dependencies - libxul-dev
[6] In the htmlvalidator directory, run ./configure and make
[7] Open the just-created .xpi in FF and install it

http://users.skynet.be/mgueury/mozilla/index.html