Tuesday, January 30, 2018

Useful .htaccess Tricks and Tips


Some people may not be aware of the power of .htaccess. In this article I will cover several .htaccess techniques that I have used or tested before. Some of them are essential tricks for protecting your web server against malicious attacks, while others help you perform simple tasks efficiently, such as redirection and web server optimization.

GENERAL SETTINGS:

The following .htaccess snippets will help you achieve simple tasks such as redirection and web server optimization.

1. SET TIMEZONE

Sometimes, when you use the date or mktime functions in PHP, you will see a warning message about the timezone. One way to solve it is to set the timezone for your server. A list of supported timezones can be found here
SetEnv TZ Australia/Melbourne
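If your host runs PHP as an Apache module (mod_php), you can also set the timezone through PHP's own configuration from .htaccess; this is a sketch, assuming mod_php is available (on other setups, such as PHP-FPM, this directive is not recognized):

```apache
# assumes PHP runs as mod_php; not recognized under CGI/FPM setups
php_value date.timezone "Australia/Melbourne"
```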

2. SEO FRIENDLY 301 PERMANENT REDIRECTS

Why is it SEO friendly? Modern search engines can detect a 301 Permanent Redirect and update their existing records accordingly. Note that Redirect takes a URL path as its first argument, not a full URL:
Redirect 301 /home http://www.onlinetutz.com/
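When a whole set of URLs moves at once, RedirectMatch accepts a regular expression instead of a fixed path; the /blog/ and /articles/ paths below are hypothetical examples:

```apache
# redirect every old /blog/ URL to the matching /articles/ URL (example paths)
RedirectMatch 301 ^/blog/(.*)$ http://www.onlinetutz.com/articles/$1
```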

3. SKIP THE DOWNLOAD DIALOGUE

Usually when you try to download something from a web server, you get a dialogue asking whether you want to save the file or open it. To avoid that, you can use the code below in your .htaccess file:
AddType application/octet-stream .pdf
AddType application/octet-stream .zip
AddType application/octet-stream .mov

4. SKIP WWW

One SEO guideline is to make sure there is only one URL pointing to your website. Therefore, you will need this to redirect all www traffic to non-www, or the other way around.
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} ^www\.onlinetutz\.com$ [NC]
RewriteRule ^(.*)$ http://onlinetutz.com/$1 [L,R=301]
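The opposite direction, forcing the www host, follows the same pattern; a sketch:

```apache
# redirect non-www traffic to the www host instead
RewriteEngine On
RewriteCond %{HTTP_HOST} ^onlinetutz\.com$ [NC]
RewriteRule ^(.*)$ http://www.onlinetutz.com/$1 [L,R=301]
```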

5. CUSTOM ERROR PAGE

Create a custom error page for each of the error codes.
ErrorDocument 401 /error/401.php
ErrorDocument 403 /error/403.php
ErrorDocument 404 /error/404.php
ErrorDocument 500 /error/500.php

6. COMPRESS FILES

Optimize your website's loading time by compressing files to a smaller size.
# compress text, html, javascript, css, xml:
AddOutputFilterByType DEFLATE text/plain
AddOutputFilterByType DEFLATE text/html
AddOutputFilterByType DEFLATE text/xml
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE application/xml
AddOutputFilterByType DEFLATE application/xhtml+xml
AddOutputFilterByType DEFLATE application/rss+xml
AddOutputFilterByType DEFLATE application/javascript
AddOutputFilterByType DEFLATE application/x-javascript

7. CACHE FILES

File caching is another popular approach to optimizing website loading time. Note that the Header directive requires mod_headers.
<FilesMatch "\.(flv|gif|jpg|jpeg|png|ico|swf|js|css|pdf)$">
Header set Cache-Control "max-age=2592000"
</FilesMatch>
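If mod_expires is available, the same effect can be expressed per content type rather than per file extension; a sketch, assuming the module is enabled:

```apache
<IfModule mod_expires.c>
 ExpiresActive On
 # cache images for one month, CSS and JavaScript for one week
 ExpiresByType image/png "access plus 1 month"
 ExpiresByType image/jpeg "access plus 1 month"
 ExpiresByType text/css "access plus 1 week"
 ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```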

8. DISABLE CACHING FOR CERTAIN FILE TYPE

On the other hand, you can disable caching for certain file types.
# explicitly disable caching for scripts and other dynamic files
<FilesMatch "\.(pl|php|cgi|spl|scgi|fcgi)$">
Header unset Cache-Control
</FilesMatch>

SECURITY SETTINGS

The following .htaccess snippets will enhance the security of your web server. Hotlinking protection, for example, is useful to stop other people from using images stored on your server.

1. HOTLINKING PROTECTION WITH .HTACCESS

Hate it when people steal bandwidth from your website by hotlinking images hosted on your web server? Use this to prevent it from happening.
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://(www\.)?onlinetutz\.com/.*$ [NC]
RewriteRule \.(gif|jpg|swf|flv|png)$ /feed/ [R=302,L]
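Instead of redirecting hotlinkers, you can serve them a replacement image; /images/hotlink.png is a hypothetical file you would create yourself, and the extra condition prevents the rule from rewriting the replacement image into an endless loop:

```apache
# serve a warning image to hotlinkers instead of redirecting
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://(www\.)?onlinetutz\.com/ [NC]
# don't rewrite the replacement image itself, or the rule would loop
RewriteCond %{REQUEST_URI} !^/images/hotlink\.png$
RewriteRule \.(gif|jpg|png)$ /images/hotlink.png [L]
```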

2. PREVENT HACKS

If you want to increase the security of your website, you can add these few lines of code to prevent some common hacking techniques by detecting malicious URL patterns.
RewriteEngine On

# proc/self/environ? no way!
RewriteCond %{QUERY_STRING} proc/self/environ [OR]

# Block out any script trying to set a mosConfig value through the URL
RewriteCond %{QUERY_STRING} mosConfig_[a-zA-Z_]{1,21}(=|\%3D) [OR]

# Block out any script trying to base64_encode crap to send via URL
RewriteCond %{QUERY_STRING} base64_encode.*\(.*\) [OR]

# Block out any script that includes a <script> tag in URL
RewriteCond %{QUERY_STRING} (<|%3C).*script.*(>|%3E) [NC,OR]

# Block out any script trying to set a PHP GLOBALS variable via URL
RewriteCond %{QUERY_STRING} GLOBALS(=|\[|\%[0-9A-Z]{0,2}) [OR]

# Block out any script trying to modify a _REQUEST variable via URL
RewriteCond %{QUERY_STRING} _REQUEST(=|\[|\%[0-9A-Z]{0,2})

# Respond to all blocked requests with a 403 Forbidden error
RewriteRule .* - [F,L]

3. BLOCK ACCESS TO YOUR .HTACCESS FILE

The following code will prevent users from accessing your .htaccess file. You can also block multiple file types.
# secure htaccess file
<Files .htaccess>
order allow,deny
deny from all
</Files>

# prevent viewing of a specific file
<Files secretfile.jpg>
 order allow,deny
 deny from all
</Files>

# multiple file types
<FilesMatch ".(htaccess|htpasswd|ini|phps|fla|psd|log|sh)$">
 Order Allow,Deny
 Deny from all
</FilesMatch>

4. RENAME HTACCESS FILES

You can also rename your .htaccess file to something else to prevent access. Note that the AccessFileName directive must be set in the main server configuration or a virtual host; it cannot be set from a .htaccess file itself.
AccessFileName htacc.ess

5. DISABLE DIRECTORY BROWSING

Prevent the server from displaying the directory index, or enable it.
# disable directory browsing
Options All -Indexes

# enable directory browsing
Options All +Indexes

6. CHANGE DEFAULT INDEX PAGE

You can change the default page from index.html, index.php, or index.htm to something else. You can also list several names; the server serves the first one it finds.
DirectoryIndex business.html index.php index.html

7. BLOCK UNWANTED VISITOR BASED ON REFERRING DOMAIN

Block visitors who arrive via links from domains you do not trust. Note that the last RewriteCond must not carry the OR flag.
# block visitors referred from the indicated domains
<IfModule mod_rewrite.c>
 RewriteEngine on
 RewriteCond %{HTTP_REFERER} scumbag\.com [NC,OR]
 RewriteCond %{HTTP_REFERER} wormhole\.com [NC]
 RewriteRule .* - [F]
</IfModule>

8. BLOCKING REQUEST BASED ON USER-AGENT HEADER

This method could save your bandwidth quota by blocking certain bots or spiders from crawling your website.
# block bad bots and spiders based on their User-Agent header
<IfModule mod_setenvif.c>
SetEnvIfNoCase User-Agent .*(craftbot|download|extract|stripper|sucker|ninja|clshttp|webspider|leacher|collector|grabber|webpictures) HTTP_SAFE_BADBOT
SetEnvIfNoCase User-Agent .*(libwww-perl|aesop_com_spiderman) HTTP_SAFE_BADBOT
Order Allow,Deny
Allow from all
Deny from env=HTTP_SAFE_BADBOT
</IfModule>

9. SECURE DIRECTORIES BY DISABLING EXECUTION OF SCRIPTS

Mapping script files to the CGI handler and then disabling CGI execution causes the server to refuse requests for those files, which is useful for directories such as upload folders.
# secure directory by disabling script execution
AddHandler cgi-script .php .pl .py .jsp .asp .htm .shtml .sh .cgi
Options -ExecCGI
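An alternative that avoids the CGI handler trick altogether is to deny requests for script files outright; a sketch for a directory that should only serve static content:

```apache
# refuse all direct requests for script files in this directory
<FilesMatch "\.(php|pl|py|cgi|sh)$">
 Order Allow,Deny
 Deny from all
</FilesMatch>
```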