This just got a bit more scary because of the release of Firesheep, an add-on for Firefox that sniffs cookies on open wireless networks. It lets you see who is logged into what on the network and hijack their session with one click, no skill required.
It affects any site not using SSL (e.g. Facebook, Twitter, etc.). It also affects sites that drop back out of SSL after login has taken place (e.g. Amazon, which is how Impress works, I think).
There's a good Security Now podcast on Firesheep (skip to the last half hour), or have a look at this blog post. Basically he's saying this will probably force all the big players to move to full-time SSL.
See also Aph3x's thread on setting up Impress under SSL.
In the extras folder of the package you will see the files needed to convert the site to use https for all login procedures.
Then in preferences you just toggle on "use ssl for login"
Hi all
I've received an e-mail from a member of my website as follows :
Just noticed that the login page (even when viewed over HTTPS) actually sends the login traffic in-the-clear over the internet. The actual HTML code fragment of relevance is:
Again, with my security hat on, that's horribly bad practice. We'd normally recommend that login pages are viewable over HTTP - but that the actual form submission posts over HTTPS. Post-login, all traffic should be sent over HTTPS - to prevent interception of session cookies.
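For reference, the practice he is describing just means pointing the form's action at an absolute https:// URL, so the credentials themselves never travel in the clear even if the page was served over HTTP. A sketch (the URL and field names here are placeholders, not Impress's actual ones):

```html
<!-- The page itself may be served over plain HTTP, but the
     credentials are submitted over SSL because the form's
     action is an absolute https:// URL. -->
<form method="post" action="https://www.example.com/user.php">
    <input type="text" name="uname" />
    <input type="password" name="pass" />
    <input type="submit" value="Login" />
</form>
```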
This is something that has been on my 'To Do' list, and his e-mail has motivated me to look into it.
I do have https available on my server, and the secure login page does use the https protocol, but a) is the claim true that even then the login is sent in the clear, and if so, why? b) how easy is it to make the whole site use the https port by default rather than http? And c) what potential issues are there in doing this?
Ta
Ted
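On (b), assuming mod_rewrite is available, forcing the whole site onto https is usually only a few lines in the root .htaccess. A sketch only — test carefully, since a wrong condition can cause a redirect loop:

```apache
RewriteEngine On
# Redirect any request that did not arrive over SSL to the
# same URL on https:// (permanent redirect).
RewriteCond %{HTTPS} !=on
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```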
In my logs and counter (PHP-Stats) I noticed that some bots (IPs) tried to exploit a vulnerable contact.php script. Also bots such as MaMa CaSpEr and plaNETWORK.
The following is an extract from WebmasterWorld.com:
From Wizcrafts:
There is an Indonesian based Byroenet IRC vulnerability scanner probing all websites for a vulnerable contact.php script, usually part of Joomla or e107. The attacks use POST to include a remote file and inject hostile codes into exploited websites. The scanner in this instance goes by a variety of hard coded hacking "crew" names, including the following: MaMa CaSpEr, b3b4s Bot Search, dex Bot Search, Dex Bot Search, kmccrew Bot Search, plaNETWORK Bot Search, rk q kangen, sasqia Bot Search, sledink Bot Search, Mozilla/5.0, Mozilla/4.76 [ru] (X11; U; SunOS? 5.7 sun4u), perl post. They will no doubt be adding more user agents from time to time, reflecting new hacking crews.
To protect Apache server websites from these attacks, add the following directives to your root .htaccess. Expect more user agents to come from new crews.
Uncomment the POST condition if you do not allow a direct visitor POST to your blog, via a blog page named contact.php
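The directives Wizcrafts publishes are along these lines — a sketch covering only a few of the agents listed above; his post has the full, maintained list:

```apache
RewriteEngine On
# Forbid requests whose User-Agent matches the scanner "crew" names.
RewriteCond %{HTTP_USER_AGENT} MaMa\ CaSpEr [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Bot\ Search [NC,OR]
RewriteCond %{HTTP_USER_AGENT} rk\ q\ kangen [NC]
RewriteRule .* - [F]

# Uncomment to also forbid direct visitor POSTs to contact.php:
# RewriteCond %{REQUEST_METHOD} POST
# RewriteRule contact\.php - [F]
```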
Same thing happened to sublime when he was running d3forum, he switched it to cbb and the spam just stopped.
Wow! It was 776 when I reported it to him last night - now up to 2000
I've dropped him an email of course to update him.
Well hopefully he will learn and devise a better way, then release a new protector version to combat the problem
Haha!
It can happen to you...
Actually, the PHPList documentation says they are sent using sendmail by default. I dug into the docs and found a way to change the sending method to SMTP. It's a blind shot, let's see what happens.
BTW, I don't know how to look SMTP server logs, registry or whatever it has. I'm lost on this.
Does PHPList keep track of those emails?
Maybe the SMTP server kept track of those emails
Regards.
I'm having problems with my VPS (WHM+cPanel) and a mailing list I manage using PHPList + the Xoops/ImpressCMS -> PHPList bridge module. Everything was working fine, but I think the last mailing I sent failed, despite what the PHPList report said.
To track the problem, I'd like some way to track all mails received and sent in, say, one hour. Something like:
"Show me a list of all mails sent and received in the last hour, and tell me what happened to them."
How could I do this?
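On a WHM/cPanel box, outgoing mail normally goes through Exim, whose main log lives at /var/log/exim_mainlog. Every line starts with a timestamp, so a grep on the date and hour gets you most of the way. A sketch run against a made-up sample — on the server you would grep the real log instead:

```shell
# Made-up sample lines in Exim's mainlog format ("<=" = message
# received, "=>" = delivered). On cPanel the real file is
# /var/log/exim_mainlog.
cat > /tmp/exim_mainlog.sample <<'EOF'
2010-11-01 10:05:12 1PCabc-000123-AB <= list@example.com
2010-11-01 10:05:13 1PCabc-000123-AB => member1@example.net
2010-11-01 11:30:02 1PCdef-000456-CD <= other@example.com
EOF
# Everything Exim logged between 10:00 and 10:59 on 2010-11-01:
grep '^2010-11-01 10:' /tmp/exim_mainlog.sample
```

Each message gets an ID (like 1PCabc-000123-AB), so once you spot a suspect message you can grep for its ID to follow it through receipt, delivery attempts, and bounces.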
OK then. I just got it wrong.
Thank you for your answer. I didn't know that it doesn't affect the server.
and it was answered.
Either I am not understanding the question - or you aren't understanding the answer.
Using allow/deny does not block the server. It just blocks HTTP access for those on the deny list.
The question was whether it's possible to allow access to certain files only for the server itself (not for access via the client, i.e. the browser).
I am not sure which file you are trying to block - but it's rather simple to do, whatever the file is.
you would need to use an allow deny rule.
ex:
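Something like this in .htaccess — just a sketch, with "config.php" standing in for whichever file you want to protect:

```apache
<Files "config.php">
    Order Deny,Allow
    Deny from all
    Allow from 127.0.0.1
</Files>
```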
Never mind. Seems not to work anyway.
However, in case someone knows: Please answer anyway. This might be helpful for someone else as well.
I want to restrict access to specific patterns to localhost only.
Current .htaccess:
RewriteEngine On
RewriteRule latest.html latest.php [L]
RewriteRule ^([^/]*)\.xml$ xmlfactory.php?id=$1 [L]
What I want to have is:
- Everybody should be allowed to access latest.html. The second rule, however, should only apply for localhost. So if I visit the website via my browser, I should see a 403 Forbidden; if the server is querying itself, it should get access to the file.
Is it possible to do it with
RewriteCond %{REMOTE_ADDR} somehow?
Please help me
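In case this helps someone later: yes, RewriteCond %{REMOTE_ADDR} can do this. A sketch, assuming the server queries itself over 127.0.0.1 (adjust the address if it connects via its public IP instead):

```apache
RewriteEngine On
RewriteRule latest.html latest.php [L]
# Forbid the .xml rewrite for anyone who is NOT localhost...
RewriteCond %{REMOTE_ADDR} !^127\.0\.0\.1$
RewriteRule ^([^/]*)\.xml$ - [F,L]
# ...and apply it normally for requests from the machine itself.
RewriteRule ^([^/]*)\.xml$ xmlfactory.php?id=$1 [L]
```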
well.
Quote:
Warning
This feature has been DEPRECATED as of PHP 5.3.0 and REMOVED as of PHP 6.0.0. Relying on this feature is highly discouraged.