Re: Concerns about the use and lack of use of HTTPS with ICMS

This just got a bit more scary because of the release of Firesheep, an add-on for Firefox that sniffs cookies on open wireless networks. It lets you see who is logged into what on the network and hijack their session with one click, no skill required.

It affects any site not using SSL (e.g. Facebook, Twitter, etc.). It also affects sites that drop back out of SSL after login has taken place (e.g. Amazon, which I think is how Impress works).
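The cookie-sniffing attack works because the session cookie travels with every plain-HTTP request. If a site stays on SSL throughout, PHP can also mark its session cookie Secure, so browsers refuse to send it over plain HTTP at all. A sketch, assuming PHP runs as an Apache module (mod_php):

```apache
# .htaccess: only send the PHP session cookie over HTTPS connections,
# and hide it from JavaScript while we're at it
php_flag session.cookie_secure on
php_flag session.cookie_httponly on
```

This only helps once every page is served over HTTPS; with the Secure flag on, a plain-HTTP page can't see the session at all.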

There's a good Security Now podcast on Firesheep (skip to the last half hour), or have a look at this blog post. Basically he's saying this will probably force all the big players to move to full-time SSL.

See also Aph3x's thread on setting up Impress under SSL.



Re: Concerns about the use and lack of use of HTTPS with ICMS

  • 2010/11/12 17:06:30
  • Will

In the extras folder of the package you will see the files needed to convert the site to use https for all login procedures.

Then in preferences you just toggle on "use ssl for login"



Concerns about the use and lack of use of HTTPS with ICMS

Hi all

I've received an e-mail from a member of my website as follows:


Just noticed that the login page (even when viewed over HTTPS) actually sends the login traffic in-the-clear over the internet. The actual HTML code fragment of relevance is:

<form style="margin-top: 0px;" action="http://www.mysite.com/user.php" method="post">


Again, with my security hat on, that's horribly bad practice. We'd normally recommend that login pages are viewable over HTTP - but that the actual form submission posts over HTTPS. Post-login, all traffic should be sent over HTTPS - to prevent interception of session cookies.


This is something that has been on my 'To Do' list, and his e-mail has motivated me to look into it.

I do have https available on my server, and the secure login page does use the https protocol, but: a) is the claim true that even then the login is sent in the clear, and if so, why? b) how easy is it to make the whole site use the https port by default rather than http? And c) what potential issues are there in doing this?
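For point (b), the blunt server-level approach (a sketch, not ImpressCMS-specific; the CMS base URL would also need changing to https) is a blanket redirect in the root .htaccess:

```apache
RewriteEngine On
# Redirect every plain-HTTP request to its HTTPS equivalent
RewriteCond %{HTTPS} !=on
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

Note that a redirect cannot rescue the first login POST if the form action is hard-coded to http:// as in the snippet above: the credentials have already crossed the wire in the clear by the time the redirect fires, so the form action itself must point at https.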

Ta

Ted



Block MaMa CaSpEr and Bot Search in .htaccess

In my logs and counter (PHP-Stats) I noticed that some bots (IPs) tried to exploit a vulnerable contact.php script, among them bots such as MaMa CaSpEr and plaNETWORK.

The following is an extract from WebmasterWorld.com:

From Wizcrafts:
There is an Indonesian based Byroenet IRC vulnerability scanner probing all websites for a vulnerable contact.php script, usually part of Joomla or e107. The attacks use POST to include a remote file and inject hostile codes into exploited websites. The scanner in this instance goes by a variety of hard coded hacking "crew" names, including the following: MaMa CaSpEr, b3b4s Bot Search, dex Bot Search, Dex Bot Search, kmccrew Bot Search, plaNETWORK Bot Search, rk q kangen, sasqia Bot Search, sledink Bot Search, Mozilla/5.0, Mozilla/4.76 [ru] (X11; U; SunOS? 5.7 sun4u), perl post. They will no doubt be adding more user agents from time to time, reflecting new hacking crews.

To protect Apache server websites from these attacks, add the following directives to your root .htaccess. Expect more user agents to come from new crews.

Uncomment the POST condition if you do not allow a direct visitor POST to your blog, via a blog page named contact.php

# RewriteCond %{THE_REQUEST} ^POST\ /your blog directory/.*contact\.php [OR]
RewriteCond %{HTTP_USER_AGENT} ^MaMa|plaNETWORK|dex| [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Bot\ Search|casper|crew|kangen [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Mozilla/5\.0|perl\ post$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^Mozilla/4\.76\ \[ru]\ \(X11;\ U;\ SunOS\?\ 5\.7\ sun4u\)$
RewriteRule .* - [F]


Change the RewriteRule to include a custom 403 page, if you use one.
Example with a custom 403 page in the web root:

RewriteRule !^403\.(s?html|php)$ - [F]


Note. There is no reason to allow robots.txt when forbidding hack tools.

If you do have a page named contact.php, make sure you examine the code for security checks against remote file inclusion (RFI) exploits. Or, rename that file and change the links to it (then have it checked for vulnerabilities)!

Get the latest version of any CMS or blog software you have installed on your server or website. This specifically includes the Joomla and e107 CMS scripts! Plus, check your web root directory for the presence of a file containing the name "casper", or anything ending in .pl that you didn't put there.


Reply from Wizcrafts:
Here are more .htaccess directives pertaining to exploits used by this hacking gang and others similar to them.

# Mod_Access block-rule:
<Files *>
order deny,allow
# Block Indonesia
deny from 110.136.176.0/20 118.96.0.0/15 125.164.64.0/19
</Files>
Options +FollowSymLinks
RewriteEngine On
RewriteOptions inherit
RewriteBase /
RewriteCond %{HTTP_USER_AGENT} ^libwww-perl/ [OR]
RewriteCond %{THE_REQUEST} _inject%20 [NC,OR]
RewriteCond %{THE_REQUEST} ^POST\ .*/e107\ HTTP/1\.[01]$ [OR]
RewriteCond %{QUERY_STRING} ^sIncPath=%7Cecho [OR]
RewriteCond %{QUERY_STRING} ^sIncPath=http://.+\.fileave\.com/
RewriteRule .* - [F]



Re: GIJOE owned

  • 2010/2/6 8:35:59
  • Will

Same thing happened to sublime when he was running d3forum; he switched it to cbb and the spam just stopped.



Re: GIJOE owned

  • 2010/2/3 16:14:53
  • david

The issue has been dealt with, apparently.



Re: GIJOE owned

  • 2010/2/3 13:45:40
  • david

Wow! It was 776 when I reported it to him last night - now up to 2000

I've dropped him an email of course to update him.



Re: GIJOE owned

  • 2010/2/3 12:47:43
  • Tom

Well, hopefully he will learn and devise a better way, then release a new Protector version to combat the problem.



Re: GIJOE owned

Haha!

It can happen to you...

If you can't understand what I'm saying, you're not geek enough
ISegura.es


GIJOE owned

Even the Protector guru isn't safe from spammers...



Re: Track my VPS mailing service?

Actually, the PHPList documentation says they are sent using sendmail by default. I dug into the docs and found a way to change the sending method to SMTP. It's a blind shot; let's see what happens.

BTW, I don't know how to look at the SMTP server's logs, or whatever records it keeps. I'm lost on this.
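On a WHM/cPanel VPS the MTA is normally Exim, which logs every message to /var/log/exim_mainlog. This self-contained sketch greps a simulated three-line excerpt the same way you would grep the real log (the log lines and addresses here are illustrative, not from any real server):

```shell
# Simulated excerpt of an Exim mainlog; on a real server use
#   LOG=/var/log/exim_mainlog
LOG=$(mktemp)
cat > "$LOG" <<'EOF'
2010-02-01 14:05:01 1ABCDE-000001-AA <= list@mysite.com
2010-02-01 14:05:02 1ABCDE-000001-AA => subscriber@example.com
2010-02-01 15:30:00 1ABCDE-000002-BB ** bounced@example.com
EOF

# Exim tags lines: "<=" = message received, "=>" = delivered, "**" = failed.
# Everything that happened in the 14:00 hour on 2010-02-01:
grep '2010-02-01 14:' "$LOG" | grep -E '(<=|=>|\*\*)'
```

On the real log, `exigrep 'address-or-message-id' /var/log/exim_mainlog` (an Exim utility shipped with cPanel builds) will also stitch together every line belonging to one message.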

If you can't understand what I'm saying, you're not geek enough
ISegura.es


Re: Track my VPS mailing service?

Does PHPList keep track of those emails?
Maybe the SMTP server kept track of those emails



Track my VPS mailing service?

Regards.

I'm having problems with my VPS (WHM+cPanel) and a mailing list I manage using PHPList + the Xoops/ImpressCMS -> PHPList bridge module. Everything was working fine, but I think the last mailing I sent failed, despite what the PHPList report said.

To track the problem, I'd like some way to track all mails received and sent in, say, one hour. Something like:

"Show me a list of all mails sent and received in the last hour, and tell me what happened to them."

How could I do this?

If you can't understand what I'm saying, you're not geek enough
ISegura.es


Re: Restrict access via htaccess based on host

OK then, I just got it wrong.
Thank you for your answer; I didn't know that it doesn't affect the server.

the german icms website : www.impresscms.de


Re: Restrict access via htaccess based on host

  • 2010/1/9 13:55:29
  • Will

and it was answered.

Either I am not understanding the question - or you aren't understanding the answer.

Using allow/deny does not block the server itself; it just blocks HTTP access for clients on the deny list.



Re: Restrict access via htaccess based on host

The question was whether it's possible to allow access to certain files only for the server itself (not for access via the client, i.e. the browser).

the german icms website : www.impresscms.de


Re: Restrict access via htaccess based on host

  • 2010/1/9 12:20:19
  • Will

I am not sure which file you are trying to block, but it's rather simple to do, whatever the file is.

you would need to use an allow deny rule.
ex:

<Files ~ "\.xml$">
Order allow,deny
Deny from all
</Files>


You can do specific filenames whatever.

You can also allow, say, only certain hosts to access it (note the Order must be deny,allow so the Allow lines override the blanket Deny):
<Files ~ "\.xml$">
Order deny,allow
Deny from all
Allow from .somedomain.com
Allow from .someotherdomain.com
</Files>


Check it

HTH



Re: Restrict access via htaccess based on host

Never mind, it seems not to work anyway.
However, in case someone knows: please answer anyway, as it might be helpful for someone else as well.

the german icms website : www.impresscms.de


Restrict access via htaccess based on host

I want to restrict access to specific patterns to localhost only.

Current .htaccess:
RewriteEngine On
RewriteRule latest.html latest.php [L]
RewriteRule ^([^/]*)\.xml$ xmlfactory.php?id=$1 [L]

What I want to have is:
- Everybody should be allowed to access latest.html. The second rule, however, should only work for localhost: if I visit the website via my browser I should see a 403 Forbidden, but when the server queries itself it should get access to the file.

Is it possible to do it with
RewriteCond %{REMOTE_ADDR} somehow?

Please help me
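For the record, RewriteCond %{REMOTE_ADDR} can do exactly this. A sketch (untested, and it assumes the server queries itself as 127.0.0.1; add the server's public IP or ::1 if it doesn't):

```apache
RewriteEngine On
RewriteRule latest.html latest.php [L]

# Forbid .xml requests from everyone except localhost...
RewriteCond %{REMOTE_ADDR} !^127\.0\.0\.1$
RewriteRule ^([^/]*)\.xml$ - [F,L]

# ...then rewrite them normally; only localhost requests reach this rule.
RewriteRule ^([^/]*)\.xml$ xmlfactory.php?id=$1 [L]
```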



Re: Magic quotes: Tool of Satan or safety net for imbys?

  • 2009/9/15 9:25:14
  • Will

well.

Quote:

Warning

This feature has been DEPRECATED as of PHP 5.3.0 and REMOVED as of PHP 6.0.0. Relying on this feature is highly discouraged.



Anything that uses the word "automagically" in its description needs to be shot and dragged behind a car for a few weeks.

http://us.php.net/magic_quotes
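Given that deprecation, if a host still has it switched on, the usual stopgap is to disable it explicitly rather than rely on it. A sketch, assuming PHP runs as an Apache module (mod_php), placed in the site's root .htaccess:

```apache
# Turn magic quotes off (only meaningful for PHP < 5.4 under mod_php)
php_flag magic_quotes_gpc off
```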



