grep – finding patterns in your log files

grep is a very useful utility when working with text files on a Linux system.

It is one of several common Linux tools that support regular expressions, alongside vim, sed and awk, allowing you to home in on lines matching a particular pattern within a file.

However, even searching for simple literal patterns can prove very useful, so don't let the term "regular expression" put you off making use of it.

The syntax can be as simple as: grep patterntomatch filename

Here are some simple examples that I find particularly useful/interesting.

For example, I like having a look at the web server access logs for file types I know I don't use, as requests for them might indicate signs of a possible attack:

/var/log/apache2# grep .php access.log
104.224.15.126 - - [17/Dec/2014:02:22:53 +0400] "GET /dddd/ddd/dd.php HTTP/1.1" 404 527 "-" "-"
104.224.15.126 - - [17/Dec/2014:02:22:54 +0400] "GET /phpMyAdmin/scripts/setup.php HTTP/1.1" 404 527 "-" "-"
104.224.15.126 - - [17/Dec/2014:02:22:54 +0400] "GET /pma/scripts/setup.php HTTP/1.1" 404 527 "-" "-"
104.224.15.126 - - [17/Dec/2014:02:22:55 +0400] "GET /myadmin/scripts/setup.php HTTP/1.1" 404 527 "-" "-"
195.154.42.218 - - [17/Dec/2014:08:39:44 +0400] "GET /rgrg/rgr/rg.php HTTP/1.1" 404 527 "-" "-"
195.154.42.218 - - [17/Dec/2014:08:39:44 +0400] "GET /phpMyAdmin/scripts/setup.php HTTP/1.1" 404 527 "-" "-"
195.154.42.218 - - [17/Dec/2014:08:39:44 +0400] "GET /pma/scripts/setup.php HTTP/1.1" 404 527 "-" "-"
195.154.42.218 - - [17/Dec/2014:08:39:44 +0400] "GET /myadmin/scripts/setup.php HTTP/1.1" 404 527 "-" "-"

Above we can see a few results indicating likely automated attempts to detect misconfigured or vulnerable installations of phpMyAdmin.
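To get a quick feel for how often this is happening and from where, the same search can be pushed a little further (a rough sketch; escaping the dot makes grep match a literal ".php", and the awk/sort/uniq chain counts requests per client IP):

/var/log/apache2# grep '\.php' access.log | awk '{print $1}' | sort | uniq -c | sort -rn | head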

Taking a look at the server's logs for requests for content that doesn't exist can also paint an interesting picture. Normal user activity will rarely generate 404 errors if your site is free of broken links, and even if it isn't, as a site administrator you should have a decent idea of what constitutes an unusual pattern. The same search can be used to identify and fix broken links in any case 🙂

/var/log/apache2# grep -v robots.txt * | grep 404

21.41.58.199 - - [15/Dec/2014:02:57:32 +0400] "GET /bvbv/bvb/bv.php HTTP/1.1" 404 527 "-" "-"
121.41.58.199 - - [15/Dec/2014:02:57:33 +0400] "GET /phpMyAdmin/scripts/setup.php HTTP/1.1" 404 527 "-" "-"
121.41.58.199 - - [15/Dec/2014:02:57:34 +0400] "GET /pma/scripts/setup.php HTTP/1.1" 404 527 "-" "-"
121.41.58.199 - - [15/Dec/2014:02:57:34 +0400] "GET /myadmin/scripts/setup.php HTTP/1.1" 404 527 "-" "-"
66.249.75.24 - - [15/Dec/2014:03:17:04 +0400] "GET /rvmgthgqv.html HTTP/1.1" 404 546 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
61.160.247.7 - - [15/Dec/2014:09:42:46 +0400] "GET /manager/html HTTP/1.1" 404 508 "-" "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; .NET CLR 1.1.4322; .NET CLR 2.0.50727; .NET4.0C; .NET4.0E; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729)"

The -v robots.txt removes all of the requests that include robots.txt; I know these will be in there and I expect a 404 for each of them, so I don't need to see them.
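To see which missing paths are requested most often, something along these lines works (a sketch assuming the default combined log format, where the request path is field 7; the simple ' 404 ' match is rough but usually good enough):

/var/log/apache2# grep ' 404 ' access.log | grep -v robots.txt | awk '{print $7}' | sort | uniq -c | sort -rn | head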

In the log excerpt above, the final entry shows an attempt to detect a Tomcat manager login page. Misconfigured Tomcat servers are responsible for compromises in many environments, because a successful login allows an attacker to deploy their own code, with an extra win if the service runs as SYSTEM or root, as can sometimes be the case.

See the related Metasploit modules for more info on these attacks.

Suppose you're interested in determining where the IP address that made the suspicious connection to your server is located:

whois 61.160.247.7 | grep country | sort -u
country: CN

Combining whois to look up the IP registration information, filtering that output with grep to return only the lines containing country, and performing a unique sort to get rid of duplicates, we can see it's our friends from China who are apparently paying us a visit.
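The same idea scales to a handful of suspicious addresses at once (a sketch; whois output formats vary between registries, hence the case-insensitive match):

for ip in 61.160.247.7 104.224.15.126 195.154.42.218; do echo "== $ip"; whois "$ip" | grep -i country | sort -u; done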

Particular grep flags I find most useful:

-v — Invert the match, i.e. show lines that don't match the pattern

-E — Use extended regular expressions

-i — Ignore case

-l — Print only the names of files containing matches, rather than the matching lines

-n — Prefix each matching line with its line number
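Putting a few of these together, a case-insensitive, extended-regex search across all of the logs with line numbers might look something like this (the pattern is only an illustration):

grep -Ein 'setup\.php|manager/html' /var/log/apache2/*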

Vim – VI improved

I think it was around 1998, that I was introduced to Linux for the very first time.

I am pretty sure it was SuSE 6.2. I had a college class at the time that involved installing two operating systems and making some pretty basic system configuration changes to secure a pass for the class.

I learned about various commands and tools like YaST, and remember thinking how funny these Linux guys are. "Yet Another Setup Tool" still makes me chuckle to myself.

I had really only ever installed Win 95 on the old Pentium MMX 200 at home and was really pleased to be learning something new, that hopefully might give me some skills to get a job.

We only had to install Windows 3.1 and SuSE in our groups of 3 and then make some changes and record the steps.

There wasn’t the hardware for a physical machine each and the idea of  virtual machines was certainly not something we had ever discussed. Not like now, when spinning up a VM is as normal a task as sending an email.

I recall, being intrigued at the time by vi.

VIM(1) VIM(1)

NAME
vim – Vi IMproved, a programmers text editor

I was amazed that people actually worked with such a tool, having to use various keys to navigate the text and to insert, copy, paste and so on. I could imagine that servers reachable over the Internet with only shell access would all be managed in this way.

I didn't realize at the time how powerful an editor it actually was; I was struggling just to do some basic tasks with it.

Some 16 years later, I am still amazed by vi. I have become more comfortable with it; the more you use it, the easier it becomes, although I do still refer to my cheat sheets at times 😉

These days vi has been improved into vim, with various excellent new features, and on most modern distributions running vi actually gives you vim.
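You can usually confirm this for yourself; if vi is really vim, the version banner will say so (this assumes your vi accepts --version, which a classic vi build may not):

vi --version | head -1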

Some very basics to get you going:

create a file:

vi filename

We insert some text by leaving command mode and entering insert mode by pressing i:

i

~
~
~
~
~
~
-- INSERT -- 0,1 All

We can then type in some text such as:

Monday
Tuesday
Wednesday
Thursday
Friday

We want to save, but cannot do this in insert mode, so we need to return to command mode by pressing the Esc key. I sometimes press it a couple of times, just to be sure I am not about to start inserting colons into my text, but you only need to press it once.

We then save the changes by entering :w (don’t forget the colon)

Monday
Tuesday
Wednesday
Thursday
Friday
~
~
"filename" [New] 5L, 41C written 5,1 All

We can do things like remove lines of text by using dd.

Place your cursor on the Monday line; in command mode, gg will take you to the first line quickly.

Press dd to remove the first line:

Tuesday
Wednesday
Thursday
Friday
~
~
1,1 All

If you've made a mistake, or decided you like Mondays after all, you can press u (undo):

Monday
Tuesday
Wednesday
Thursday
Friday
~
~
~
1,1 All

We can search for strings within the file using /string

Monday
Tuesday
Wednesday
Thursday
Friday
~
~
~
/day 5,4 All

The cursor moves to the first instance of the string after the cursor's position at the time you start the search. I suggest starting at the first line with gg, then typing /day.

We can quickly replace all strings that match a particular pattern, which is a pretty useful feature:

Monday
Tuesday
Wednesday
Thursday
Friday
~
~
~
:%s/day/day evening/g

This changes the strings to the best time of any weekday:

Monday evening
Tuesday evening
Wednesday evening
Thursday evening
Friday evening
~
~
~
5 substitutions on 5 lines 5,1 All
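As an aside, the same search and replace can be done without opening the editor at all, for example with sed, which shares very similar substitution syntax (a sketch; GNU sed's -i edits the file in place):

sed -i 's/day/day evening/g' filename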

Finally, enter :q! to quit, discarding any unsaved changes:

Monday evening
Tuesday evening
Wednesday evening
Thursday evening
Friday evening
~
~
:q!

This drops you back to the command line.

For more info and further examples, check out the vimtutor:

# man vimtutor

VIMTUTOR(1) VIMTUTOR(1)

NAME
vimtutor – the Vim tutor

SYNOPSIS
vimtutor [-g] [language]

DESCRIPTION
Vimtutor starts the Vim tutor. It copies the tutor file first, so that it can be modified without changing the original file.

The Vimtutor is useful for people that want to learn their first Vim commands.


Linux Man-Pages

Linux is great, but at times you can become a bit frustrated, especially if you can't remember what you are doing at the command line or you don't work with it that often.

For example:

You find yourself sitting at the command prompt wanting to make some changes to your Apache web server, but you haven't done the task very often, or it's just been a while, and as you're getting a bit older your mobile phone likely has a better memory than you do.

You reach for your favorite search engine… WAIT.

All you should need is one command. After all, this is Linux.

# man man

MAN(1) Manual pager utils MAN(1)

NAME
man – an interface to the on-line reference manuals

*** Snipped

For example, suppose I know I want to do something with Apache. I can search the man page descriptions by keyword with the -k switch:

# man -k apache

a2dismod (8) – enable or disable an apache2 module
a2dissite (8) – enable or disable an apache2 site / virtual host
a2enmod (8) – enable or disable an apache2 module
a2ensite (8) – enable or disable an apache2 site / virtual host
ab (1) – Apache HTTP server benchmarking tool
apache2 (8) – Apache Hypertext Transfer Protocol Server
apache2ctl (8) – Apache HTTP server control interface
apachectl (8) – Apache HTTP server control interface
check_forensic (8) – tool to extract mod_log_forensic output from apache log files
DBI::ProfileDumper::Apache (3pm) – capture DBI profiling data from Apache/mod_perl
logresolve (1) – Resolve IP-addresses to hostnames in Apache log files
rotatelogs (8) – Piped logging program to rotate Apache logs

OK, we can see the list of commands related to Apache. We can read the descriptions, and maybe that is enough of a reminder. But what do the numbers mean?

Ask man:

# man man-pages

**Snipped

Sections of the Manual Pages
The manual Sections are traditionally defined as follows:

1 Commands (Programs)
Those commands that can be executed by the user from within a shell.

2 System calls
Those functions which must be performed by the kernel.

3 Library calls
Most of the libc functions.

4 Special files (devices)
Files found in /dev.

5 File formats and conventions
The format for /etc/passwd and other human-readable files.

6 Games

7 Conventions and miscellaneous
Overviews of various topics, conventions and protocols, character set standards, and miscellaneous other things.

8 System management commands
Commands like mount(8), many of which only root can execute.

 

It turns out the numbers are sections of the manual pages, so this information can help you decide whether the entry you suspect is actually the command you are looking for. Great.
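Where a name exists in more than one section, you can ask for a specific section by number. For example, passwd appears in both section 1 (the command) and section 5 (the /etc/passwd file format):

# man 1 passwd
# man 5 passwd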

Then you can dig deeper into the command, reading all of the useful information or just paying attention to the specific areas of interest:

# man a2dismod

***Snipped

DESCRIPTION
This manual page documents briefly the a2enmod and a2dismod commands.

a2enmod is a script that enables the specified module within the apache2 configuration. It does this by creating symlinks within /etc/apache2/mods-enabled. Likewise, a2dismod disables a module by
removing those symlinks. It is not an error to enable a module which is already enabled, or to disable one which is already disabled.

Note that many modules have, in addition to a .load file, an associated .conf file. Enabling the module puts the configuration directives in the .conf file as directives into the main server context of apache2.

EXAMPLES
a2enmod imagemap
a2dismod mime_magic

Enables the mod_imagemap module, and disables the mod_mime_magic module.


Sqlmap – HTTP POST Request File

After some reading of http://carnal0wnage.attackresearch.com/2011/03/sqlmap-with-post-requests.html and thinking about how I normally deal with POST requests, I thought I would jot down a few lines as a reminder.

These examples use an HTTP request file, which you can easily capture using a proxy or a Firefox add-on.
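For reference, the request file is simply the raw HTTP request saved as text, something along these lines (hypothetical host, path and parameters):

POST /login.php HTTP/1.1
Host: target.example.com
Content-Type: application/x-www-form-urlencoded
Content-Length: 35

username=admin&password=password123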

sqlmap.py -r filename.txt --level 1 --risk 1 --dbms mysql -p parametertotest --proxy http://127.0.0.1:8080

** --level and --risk can be increased if sqlmap doesn't confirm an injection but you believe there is one; 5 and 3 are the maximums respectively.

sqlmap.py -r filename.txt --dbms mysql --proxy http://127.0.0.1:8080 --current-db

** Obtain the current database name

sqlmap.py -r filename.txt --dbms mysql --proxy http://127.0.0.1:8080 -D dbname --tables

** Obtain table names

sqlmap.py -r filename.txt --dbms mysql --proxy http://127.0.0.1:8080 -D dbname -T tablename --columns

** Obtain column names

sqlmap.py -r filename.txt --dbms mysql --proxy http://127.0.0.1:8080 -D dbname -T tablename -C col1,col2,col3 --dump

** Obtain data from the specified columns

You might want to specify a particular technique (see the example after the list below):

--technique BEUS

** Remove letters to exclude that technique from the test.

B: Boolean-based blind

E: Error-based

U: Union

S: Stacked queries

T: Time-based blind
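For example, to limit testing to the boolean-based blind and union techniques only (reusing the request file from above):

sqlmap.py -r filename.txt --dbms mysql --technique=BU --proxy http://127.0.0.1:8080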

Some other interesting details:

sqlmap.py -r filename.txt --dbms mysql --proxy http://127.0.0.1:8080 --current-user

** Current DBMS username

sqlmap.py -r filename.txt --dbms mysql --proxy http://127.0.0.1:8080 --is-dba

** Is the current user a DBA?

sqlmap.py -r filename.txt --dbms mysql --proxy http://127.0.0.1:8080 --file-read=Path

** Read a file from the path provided.

Full documentation: http://sqlmap.sourceforge.net/doc/README.pdf

Preventing SQL Injection: https://www.owasp.org/index.php/SQL_Injection

 

* Recent edit: updated to use the -r flag for a raw request file.

Ghost In The Wires – Kevin Mitnick – Book Review

It took me a while to finally get through Kevin Mitnick's latest book, Ghost In The Wires. This was not because it isn't a page turner; the fact is, it was!

It took so long because I had been trying to squeeze the reading time in between everything else. Every time I turn around there is something else to learn or look at. Still, it beats being bored, so I can't complain!

If you have never heard anything about Kevin or the other hackers of that era, you might find this video useful for a little background.

The book is very interesting from a few perspectives. There is the personal component and the security/technology component.

The personal component gives you an idea of the way Kevin had to live his life, the impact on his family, and a sense of a world in which trust was at times very scarce. It is hard to imagine in some ways, especially how tough it must have been inside for so long, or the fear of facing a very long time locked away.

I enjoyed gaining an insight into the thinking behind some of the attacks and techniques he used when gaining access to a variety of systems (the list is quite astounding) or when he was trying to avoid capture. I was often surprised by what reads like pure simplicity; you almost think it would never work, but it did!

For anyone who is not that technical, you won't have to worry: the details are not too heavy, and the book does a decent job, I think, of explaining anything even the slightest bit technical in a way anyone familiar with computers and the Internet should be able to understand.

From a security perspective, it should be an alarm bell to many at the very least. Social engineering is alive and kicking, and it is something that many still don't seem to have enough awareness of to this day.

In fact, matters often appear to be getting worse as everyone rushes to share their intimate personal details all over the place, almost as if trusting every stranger is now a good idea.

Think about the kind of information and access he had, many times obtained by simply asking for it, and in other situations by researching targets and gathering enough relevant information to be extremely convincing, exploiting the very quality companies pride themselves on in their staff: helpfulness.

I can't help but wonder how many local businesses would fare today, especially as I see them rush to splatter everything they can online in an effort to boost their page rankings, often, I am guessing, without a thought for the potential risks.

The book mentions a documentary that I really need to check out, called Freedom Downtime.

Perhaps another day.

Coliseumlab – Observations

Back in March I wrote a post about my beta testing of http://Coliseumlab.com, Elearnsecurity's latest project.

Well, the labs are live, the first new students have arrived in the forums, and they are experiencing the fresh new design and interface. Wow, what a difference from the beta.

I personally was eager to test the new lab simulations that had just been released; there are currently 14 live, with, I believe, more in the pipeline.

The designs are great yet simple enough that you're not overwhelmed as you try to learn more about the concepts and techniques of particular attacks. You would probably benefit from some prior exposure to tools like Burp, DirBuster, sqlmap and Firebug to really get the most out of your time, though the support in the forums will assist anyone who feels the need to ask for more information.

I don't think I need to go into great detail about the actual attacks in this post; you can get those from the link above. Instead I wanted to note a few points where I personally felt my time in the environment benefited my own studies.

My observations of any gains I’ve made:

First, I feel much more confident using some of the tools. Hands-on practice with software tends to have that effect.

I’m much more serious about taking useful notes. Well worth the effort and something to maintain/improve on into the future.

Good notes save you time that could easily be wasted searching the net. Searching the net or through books is not always a shortcut.

Sure, it's easy to search the net, but it is also too easy to get distracted by all the extra content being thrown your way.

Books? Well, the one you need is rarely within reach of your desk.

I spent some time looking at other aspects I was curious about; when given a learning plan for a lab, I like to think, "what else can I learn from this?"

This was good; I ended up feeling compelled and motivated to write a Ruby script that helps me with a particular Joomla information-gathering task. Thanks to Digitalwestie and Matugm for hints in the right direction; I know you guys are busy with your own stuff, so I do appreciate people who take time out to give a few pointers.

There may be other tools doing the same job, but being able to solve your own problem has its advantages, and again it keeps me away from the distraction of searching. I also happen to be reading about Ruby at the moment, so a chance to get away from the usual puts "hello world" stuff and develop these skills somewhat was fantastic!

Trophies gathered.


Social Engineering The Art of Human Hacking – Review

So I have just completed Chris Hadnagy's book and now I am a social engineering master. Well, perhaps "master of this art" is an exaggeration on my part, but I certainly believe I have learned a great deal from reading what is, in my view, an essential guide to the inner workings of social engineering, be it used for good or evil.

The book does not claim to turn anyone into a master. It does, though, give you a broad and deep understanding, and it will point you to many other areas of research if becoming a master is your end goal. Considering the years of research that have gone into the various disciplines discussed and the skills you would need to cover, I wish you happy researching if that is your aim. I certainly plan to return to many of those areas out of interest myself.

As the web has developed into the social monster we see today, its teachings, I believe, may become even more important to many of us in the years to come. It should certainly convince people that a good security awareness programme must be adopted everywhere and continually tested and updated.

Just observing my students and others I know online day to day, and seeing the kind of information they share without a thought for the impact this may have on their own personal security, never mind the organisations they work for, really makes me think it could become open season for crime in many different ways.

Hopefully they are all lucky and don't fall victim, or perhaps they will listen to my constant warnings to take more care.

Thinking back to my own perception of social engineering before I read the book, I had a good idea of what I considered relatively simple types of attack; unfortunately, many people still seem to fall for these broad, sweeping attacks.

Many of us experienced web users tend to spot these, or our spam filters sweep them away so we don't have to endure yet another press of the delete button. But what about a targeted attack? How many of us would fall victim then?

I imagine a very high proportion of people would. In fact, given enough information about the target and the right set of circumstances, we all could quite easily, and if you think "no way, not I", then you are probably the most likely to fall for one.

The book outlines the lengths that individuals or groups will go to in order to tailor an attack customised especially for you.

Essentially, they gather your information from just about any resource they can, come into contact with you in person or with others around you, read your face, emotions and behaviours like a book, and then use all of this minute detail to manipulate you into giving further information away, or perhaps fully compromise your systems in a variety of ways: sending malicious files, dropping off CDs/DVDs or USB devices loaded with more nasty stuff at your office, convincing you to browse to malicious websites, or stealing your systems from right under your nose!

In actual fact it could be quite scary reading for many.

The book also offers good advice, from what you can do about it all to what to look for in an auditor if you have already started to think about how these attacks may affect your business and would like to test and improve your defences.

The book promotes something which I truly think is important: "be aware, educated and prepared".

I have heard recently that there is "no patch for human stupidity". Well, there is no immediate fix, but we can certainly receive constant updates: through education.


Glasgow-Based Computer and Software Repair Training – Stow College

Recent developments I am working on include a brand new computer repair training course in Glasgow, aimed at home users and small businesses looking for computer repair solutions.

Its purpose is to help you with the ongoing break/fix repair costs that many face on a regular basis, often as a result of a virus or other malware.

We’ve all had those problems. Even with software designed to fend off these issues.

In these tough economic times making every penny count is vital.

Can you afford to pay out for expensive technical support?

If not, this is the course for you.

This course, run over six sessions, introduces exactly what you need to know in order to do your own repairs:

from sourcing your parts and software, to getting to grips with the tools and techniques you need to recover your systems and keep them running,

ending with a discussion of the ever-growing security risks we all face as users of the modern Internet.

It takes a hands-on approach, keeping technical jargon to a minimum, so you gain the practical skills to get the system working again like the first time you turned it on, hopefully armed with a better idea of how to protect yourself and keep your system the way you want it.

There are options for funding available. Feel free to contact Stow for more information.

More details

The next course is running soon, so don't delay in securing your place.

Supported by materials available online.

You should be comfortable using your computer system prior to joining this course to get the most out of this training.


Elearnsecurity BETA testing of new labs

Elearn Security Online Labs and Challenges BETA Review

I was lucky enough to be asked to join a small team of beta testers to experience firsthand the latest developments by Elearnsecurity: the new online labs, a dedicated sandboxed cloud-based environment for each student, accessible from the main learning material interface/portal.

On 29 March 2011 at 8pm GMT, I eagerly awaited access, to get a look at the concepts and ideas that up to then I had only read about in discussions and suggestions by Armando in the forums.

The whole project, I should add, came off the back of consultation with the existing student community; it is good to see a provider listening to the demands of its customers.

Reading the text in the forums, I could tell that Armando was excited about this project as it would bring yet another perspective to the learning model on offer from Elearnsecurity and ultimately would enhance the learner’s experience.

My experience, I would have to say, lived up to the anticipated hype!

Perhaps that was partly due to having another chance to log in with like-minded folks keen to get their hands on whatever information and experience they could.

I need to mention matugm, who had pretty much pwned it before we really got off and running; the last time I saw him posting, he was trying to get a shell!

Four hours flashed past in the blink of an eye.

Roughly 30 minutes after the agreed meet-up time, we were off and running. The only real reason for the late start, in my mind, was resolving some testers' initial technical issues and giving us group instruction via text-based chat (a headache, I would imagine, if you're on the supporting side).

The test itself involved what appeared at the outset to be a simple enough challenge for anyone who has experience in this field or has recently studied a course like eCPPT.

Though as we attacked the challenge set out in front of us, I started to realise how this setup could help me expand my working knowledge of both SQLi and the popular tool sqlmap.

Most of what I have read, and the guides I have seen up to now, focus on the GET method with vulnerable URL parameters. I struggled to find many discussions of other input channels or POST-based injection while studying for eCPPT.

This challenge was exactly what I wanted to see, exactly what I wanted to try out, and luckily for me it gave me a chance to once again apply what I had learnt up to now. Having developed new skills, you are keen to keep them sharp, which in hacking, as we know, is only really achievable with access to labs if you want to stay on the side of the good.

As well as the actual objective of retrieving information from a database and inserting data into it, I found myself thinking, "you know what, I am going to try this from different angles to learn more about using sqlmap: from a Burp log, from a config file, and straight off the CLI". This gave me a chance to work with the tool in the different ways I wanted, so I could feel I was making progress towards being comfortable with options that up to now I had really only read about in the manual.

I collected a whole bunch of data from the tool that I plan to study and research a bit further, to understand as much as I can from the SQL used and to look at ways to refine and focus my attacks.

All of this from a single login page; I could hardly believe it. Now I was wishing I had more time and energy left to keep chatting and testing with some very experienced specialists in the field. Alas, work beckoned in the morning and the lure of sleep forced me to concede, or at least to go to bed and leave my laptop cracking the hashes I had obtained from the dump (not actually an objective, but fun nonetheless; there was talk of removing this as I don't think it was intended, but I suggest leaving it in).

As far as other enhancements are concerned, my understanding is that there is a plan to add more exercises with step-by-step walkthrough examples to get those needing extra assistance off the ground, which in turn will help them gain skills and pass the cert.

More challenges are to be released to keep the more experienced students coming back for more; this in turn should, in my view, build on what is already showing signs of growing into a community.

The challenges do come with hints, but I know from chatting with Armando and the team that using hints will affect participants' scores. Plans to give away goodies to challenge winners will probably keep the hardcore away from the hints, but they still give others a good learning tool until they build up their skills and experience.

Details of the goodies and prices for access have yet to be disclosed.

In my humble view, it is a good addition, giving those of us interested in security research the hands-on approach that is needed in a sector that needs more good guys and gals on side.

Let's face it, so far the bad guys and gals appear to be winning.

eCPPT – Review and Passed

I am pleased to be able to report I am now a proud holder of eCPPT.

From my own background and perspective, the course and exam were a very enjoyable experience. I would recommend this to anyone interested in security, perhaps especially those on a limited budget.

I had done CEH prior to this course and personally found it useful in giving me a good foundation for approaching eCPPT. My day-to-day working life is not at the moment centred on security.

From my initial contact with eLearnSecurity, I was impressed by the way I was treated as a potential customer and encouraged to believe that I could achieve it.

I did ponder long and hard before I parted with my own hard-earned cash.

After making my decision to join the course, I initially felt a bit unsure about what I had bought into, mainly due to concerns that perhaps I could not do this on my own, distance-learning style.

My fears were quickly put to rest once I saw responses to my questions. I had read every post in the forums to make sure I was not asking questions already answered and making a general nuisance of myself.

The responses I received gave me things to think about and pointers on where to head next, which is useful when you're learning; building my understanding was a combination of taking in the good advice and information in the slides and videos and asking appropriate questions.

I never felt at any time that if I had tried on my own and then had to ask for more info, I would not be given some sort of support, be it from someone experienced on the course or from Armando, the creator himself.

I would also say that experience from Network+ and CCNA came in useful, as did some of my previous studies in web technology, including HTML and CSS (plus limited PHP and SQL); a basic understanding of Linux is also helpful.

The exam really does focus on expecting you to apply what you learn; I believe this to be an excellent approach. No exam cram sessions on this one, I am afraid, if you're really only looking for another CV filler.

I had good fun, and I believe that Armando is building on the course's success and looking to provide new and interesting experiences for current and potential students. I look forward to this and hope to continue as a student and contributor as I learn and have more fun.

If like me you wondered if you had what it took to perform a manual web application penetration test, then this is the one for you!

Details of the course content can be found at http://www.elearnsecurity.com/
