Monday, 16 May 2011
Enjoying TV broadcasts from around the world with Internet TV
Internet TV features:
5000 TV channels
Up-to-date channel list
Original channel rating system
Display in default or full-screen mode
No additional equipment required
Channel sorting by rating
Supports multiple formats (Windows Media & RealVideo)
Personalized favorite channels
Filter channels by country, bitrate, or content
Channels from more than 121 countries
Easy and simple to use
Free
Just download here:
Friday, 08 April 2011
007 Shell trojan™
Do you know what 007 Shell trojan is and what damage it can do to your computer? If your computer is infected right now, this article will be very useful: it answers the questions on your mind and shows how to get rid of 007 Shell trojan.
007 Shell trojan is a malicious Trojan horse. Recently the Trojan has spread rapidly over the internet and many computers have been infected. 007 Shell trojan is so destructive that it can delete system files, steal account passwords, and change system settings, as well as bring spyware and rogue programs into your system.
If your computer is infected by 007 Shell trojan, your PC will malfunction and show various problems, for example:
* Unable to access the internet
* Windows Update failures
* "dll not found" or "exe not found" errors caused by missing system files
* Blue screen of death
It is very difficult to remove 007 Shell trojan manually. Nevertheless, here is a brief manual removal procedure:
1. Open Task Manager by pressing Ctrl+Alt+Del and stop the suspicious processes
2. Delete all related files
3. Remove the associated registry values
Note: the difficulty with manual removal is that if you do not end all of the trojan's processes, you will have trouble deleting its files and registry values, and if you do not delete all of the malicious files, the Trojan will come back after a system restart. A command-line sketch of the three steps follows below.
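As a rough sketch only (the process name, file path, and registry value below are hypothetical placeholders; the real ones differ per infection), the three steps correspond to commands like these in an elevated Command Prompt:
:: Step 1 - end the malicious process (name is a hypothetical placeholder)
taskkill /F /IM 007shell.exe
:: Step 2 - delete the related file (path is a hypothetical placeholder)
del /F /Q "%SystemRoot%\System32\007shell.exe"
:: Step 3 - remove the associated registry value (value name is a hypothetical placeholder)
reg delete "HKLM\Software\Microsoft\Windows\CurrentVersion\Run" /v 007shell /f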
The most reliable way to remove 007 Shell trojan is with the help of a professional antivirus scanner. The one installed on your system may be unable to detect or delete the Trojan effectively, but there is at least one security program that can handle it. As far as we know, Spyware Cease, a professional antivirus, can remove 007 Shell trojan completely and automatically in a few steps.
1. Download Spyware Cease for free
2. Install Spyware Cease and run an online scan
3. After the scan finishes, select all the detected items and click the Remove button
What are you waiting for? Use either the manual instructions or Spyware Cease to remove 007 Shell trojan as soon as possible.
Sunday, 03 April 2011
TrojanHunter®
TrojanHunter searches for and removes trojans from your system. With an easy-to-use Scanner and a Guard that scans in the background, TrojanHunter is a must-have complement to your virus scanner. If you download files from the Internet, you need TrojanHunter!
Features:
- High-speed file scan engine capable of detecting modified trojans
- Memory scanning for detecting any modified variant of a particular build of a trojan
- Registry scanning for detecting traces of trojans in the registry
- Inifile scanning for detecting traces of trojans in configuration files
- Port scanning for detecting open trojan ports
- The Advanced Trojan Analyzer, an exclusive feature of TrojanHunter, is able to find whole classes of trojans using advanced scanning techniques
- TrojanHunter Guard for resident memory scanning – detect any trojans if they manage to start up
- LiveUpdate utility for effortless ruleset updating via the Internet
- Process list giving details about every running process on the system, including the path to the actual executable file
- Accurate removal of all detected trojans – even if they are running or if the trojan has injected itself into another process
- Built-in netstat viewer
- Extensive help files
- Free technical support via e-mail
The current version is 5.3 (Build 994)
TrojanHunter runs on Windows 2000, XP, Vista and Windows 7 (32-bit and 64-bit)

How to set up WiFi?

[Quote]
sudo apt-get install language-pack-id   # Indonesian
sudo apt-get install language-pack-af
sudo apt-get install language-pack-fr
sudo apt-get install language-pack-ms
sudo apt-get install language-pack-nl
sudo apt-get install language-pack-ice
[/quote]
Download the latest version of CakePHP from http://cakephp.org. The latest version (April 2010) is cake_1.2.6. Copy it to the Apache directory and extract it there; on Ubuntu the default is /var/www.
[Quote]
sudo cp cakephp-cakephp1x-1.2.6-0-gbe7ddfb.tar.gz /var/www
cd /var/www
sudo tar -xzvf cakephp-cakephp1x-1.2.6-0-gbe7ddfb.tar.gz
[/quote]
Rename the extracted directory to something simpler (e.g. cake_1.2.6), then create a symbolic link:
[Quote]
cd /var/www
sudo ln -s ./cake_1.2.6 ./c2
[/quote]
Enable some Apache modules: rewrite, deflate (to compress content), and headers (to modify HTTP headers). These modules are needed to improve the performance and speed of the web application.
[Quote]
sudo a2enmod rewrite
sudo a2enmod deflate
sudo a2enmod headers
sudo /etc/init.d/apache2 reload
[/quote]
Edit the file "/ etc/php5/apache2/php.ini" On line:
[Quote]
output_buffering = Off [/ quote]
change with
[Quote] Code:
output_buffering = 4096 [/ quote]
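To confirm the change landed in the right file (PHP has several php.ini files), a quick check:
[Quote]
grep -n '^output_buffering' /etc/php5/apache2/php.ini
[/quote]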
Edit the file /etc/apache2/apache2.conf and add the following directives:
[Quote]
AllowOverride All

# ---- COMPRESS CONTENT ----
# Place the 'deflate' filter on all outgoing content
SetOutputFilter DEFLATE
# Exclude uncompressible content by file type
SetEnvIfNoCase Request_URI \.(?:exe|t?gz|jpg|png|pdf|zip|bz2|sit|rar)$ no-gzip dont-vary
# Keep a log of the compression ratio on each request
DeflateFilterNote Input instream
DeflateFilterNote Output outstream
DeflateFilterNote Ratio ratio
LogFormat '"%r" %{outstream}n/%{instream}n (%{ratio}n%%)' deflate
CustomLog /var/log/apache2/deflate.log deflate
# Properly handle old browsers that do not support compression
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4\.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
# --------------------------

# ---- ADD EXPIRY DATE ----
Header set Expires "Thu, April 15, 2012 20:00:00 GMT"
# --------------------------

# ---- REMOVE ETags ----
FileETag None
# --------------------------
[/quote]
Reload Apache:
[Quote]
sudo /etc/init.d/apache2 reload
[/quote]
2. Install the YFi CakePHP application. Download the latest version of YFi Cake (I use yfi_cake-Beta-4.tar.gz) and extract it in the directory /var/www/c2:
[Quote]
sudo cp yfi_cake-Beta-4.tar.gz /var/www/c2
cd /var/www/c2
sudo tar -xzvf yfi_cake-Beta-4.tar.gz
sudo chown -R www-data. /var/www/c2/yfi_cake/tmp
sudo chown -R www-data. /var/www/c2/yfi_cake/webroot/img/graphics
[/quote]
Database settings. Create the database 'yfi' with the default username and password 'yfi'. To use the default settings, run these commands:
[Quote]
mysql -u root -p
create database yfi;
GRANT ALL PRIVILEGES ON yfi.* TO 'yfi'@'127.0.0.1' IDENTIFIED BY 'yfi';
GRANT ALL PRIVILEGES ON yfi.* TO 'yfi'@'localhost' IDENTIFIED BY 'yfi';
exit;
[/quote]
Note: for security you can change the username and password, for example mine is "submajuli". Do not forget to update the username and password in the file '/var/www/c2/yfi_cake/config/database.php' as well.
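Before editing database.php, you can verify from the shell that the new credentials actually work; a quick check, assuming the default 'yfi'/'yfi' pair from above:
[Quote]
mysql -u yfi -pyfi -h localhost yfi -e 'SELECT 1;'
[/quote]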
[Quote]
mysql -u root -p yfi < … > /dev/null
# Force-add the final rule to fix the routing tables if necessary
iptables -I POSTROUTING -t nat -o $HS_WANIF -j MASQUERADE
[/quote]
Add the CoA port in the file /etc/init.d/chilli. Look for this section:
[Quote]
OPTS="-pidfile /usr/local/var/run/$NAME.pid"
[/quote]
Add -coaport 3799, which is needed for the "User Kick off" (forced disconnect) feature:
[Quote]
OPTS="-pidfile /usr/local/var/run/$NAME.pid -coaport 3799"   ## add the CoA port
[/quote]
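After restarting chilli, you can check that it is actually listening on the CoA port; a quick check (assumes the service has been restarted):
[Quote]
sudo netstat -lunp | grep 3799
[/quote]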
Login page. Here I use the coova_json login page. Copy the coova_json folder into the Apache directory /var/www:
[Quote]
sudo cp -R /var/www/c2/yfi_cake/setup/coova_json /var/www
[/quote]
Check the file /var/www/coova_json/login.php: 1. $uamsecret must be the same as in the file /etc/chilli/config, and likewise in the file /var/www/coova_json/uam.php; the default is 'greatsecret'. 2. $port in /var/www/coova_json/login.php: if it is 3660, it should be changed to 3990. A quick consistency check is shown below.
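A simple way to compare the values on both sides (assuming the variable names above):
[Quote]
grep uamsecret /etc/chilli/config
grep -n 'uamsecret\|port' /var/www/coova_json/login.php /var/www/coova_json/uam.php
[/quote]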
Then restart your Ubuntu machine.
1. Test using a client computer. Set the client's network interface to DHCP; if the client gets an IP in the range 10.1.0.2-ff, CoovaChilli is working normally.
2. Try browsing to, for example, www.google.com. The splash page will appear.
3. You are then redirected to the login page: hs_land.php.
4. Log in with username dvdwalt@ri and password dvdwalt@ri.
5. If access is accepted, www.google.com will appear.
That completes the installation phase.
Pairing with a transparent Squid proxy: here is the iptables rule I use to redirect hotspot clients to Squid (Squid runs on the same server as the captive portal):
[Quote]
iptables -t nat -A PREROUTING -i tun0 -p tcp -s 10.1.0.0/24 -d ! 10.1.0.1 --dport 80 -j REDIRECT --to-ports 3128
[/quote]
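After adding the rule, you can confirm it is in place and watch its packet counters climb as clients browse:
[Quote]
sudo iptables -t nat -L PREROUTING -n -v --line-numbers
[/quote]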
Good luck, and remember: failure is the beginning of success.
Friday, 01 April 2011
So what exactly is Robots.txt?
Robots.txt is just a simple text file that sits on your server and gives search engine bots useful information about how they should crawl your site's URLs.
The first thing a search engine bot does before crawling your site is look for the robots.txt file located at http://yourdomain.com/robots.txt.
If you don't have one, bots will still crawl whatever pages they find; having it just makes things a bit easier.
Since bots check your robots.txt file before crawling anything else, it is a good way to instruct them to keep away from indexing any pages you don't want visible in search results.
Caution: never use robots.txt to block premium content on your site. The search engines will indeed not index those pages, but since your robots.txt file is visible even to a normal user (http://yourdomain.com/robots.txt), anyone will be able to find that premium content. Instead, place the content behind a login so that only registered users can access it, or add the 'noindex' attribute to the meta tags of that specific page (shown just below) to keep bots from indexing it.
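For reference, the meta-tag approach mentioned above looks like this; the tag goes in the <head> of the specific page you want kept out of search results:
<meta name="robots" content="noindex">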
What does Robots.txt look like?
The average robots.txt could be one of the simplest pieces of code you will ever write.
If you want a robots.txt that instructs all search engine bots to crawl everything they find, and you don't want to give any specific instructions, use the following piece of code in it.
User-Agent: *
Disallow:
'User-Agent' refers to the search engines; since there is an asterisk (*) here, the instructions apply to all engines.
'Disallow' says which section of the site should not be accessed by search engine bots. Having nothing after the colon means everything is accessible.
For most simple websites, these two lines are all you need.
If your site is a bit larger and has many folders and so on, you may want to instruct search engines to avoid some pages.
The best example is a printer-friendly version of your website located in a specific folder, say "printer-ready". There's no point in letting the search engines index two identical copies, so it's a good idea to instruct them to skip the printer-friendly version.
In that situation, the User-Agent section can be left as it is so that the instructions apply to all search engines; only a small change needs to be made in the 'Disallow' part.
User-Agent: *
Disallow: /printer-ready/
The forward slashes before and after the folder name are important. The folder path is appended to your domain name, so the rule above is read as referring to http://yourdomain.com/printer-ready/.
If the folder is actually found at www.yourdomain.com/archives/printer-ready/, the robots.txt has to be written as follows.
User-Agent: *
Disallow: /archives/printer-ready/
You can also change the User-Agent part to give instructions to just one or a few specific search engines. For example:
User-Agent: googlebot
Disallow: /archives/printer-ready/
In this case, the folder at http://yourdomain.com/archives/printer-ready remains accessible to all search engines except Google.
Techfrog's version of robots.txt looks like this:
User-agent: *
Disallow:
Sitemap: http://yourdomain.com/sitemap.xml
Here, we allow access to all search engine bots, all content can be indexed, and we state where the sitemap can be found so that the bots have access to it and don't leave any page behind while crawling (the sitemap URL above is a placeholder for the site's actual sitemap address).

How can Robots.txt be put on my site?
Once you have made the robots.txt file, just upload it to the root folder on your server; it will then be served at http://yourdomain.com/robots.txt.
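To confirm the file is being served, you can fetch it the same way a bot would; a quick check using curl (yourdomain.com is the usual placeholder):
curl http://yourdomain.com/robots.txt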
Isn't that simple?
Changing the Google logo
Assalamualaikum.
Bored with the same old Google look? Hmm...
Now I'll share how to change the Google logo to anything you like.
Example:
OK, check it out:
Step one:
- Go straight to http://funnylogo.info/create.asp
- Then just enter whatever name your heart desires
- Pick a style
- Have fun with your new Google look
Source: www.google.com