Inurl: viewerframe mode

A few tips for the newcomers on this sub !

2023.10.28 11:18 SubliminalPoet A few tips for the newcomers on this sub !

This post is mainly intended to help people who are just discovering this sub get started. It could also be useful to other folks, who knows?

What is an open directory?

Open directories (aka ODs or opendirs) are just unprotected websites that you can browse recursively, without any required authentication. You can freely download individual files from them. They're organized in a folder structure, like a local directory tree on your computer. This is really convenient, as you can also download several files at once, recursively (see below).
These sites are sometimes deliberately left open and sometimes left open inadvertently (seedboxes, personal websites with poorly protected dirs, ...). For the latter, once someone has posted them here, they're often hammered by many concurrent downloads and go down under the heavy load. When the owners become aware of it, they usually decide to protect them behind a firewall or to require a password to limit access.
That's when the famous "He's dead, Jim!" flair comes in.
Technically, an opendir is nothing more than a local directory, shared by a running web server:
cd my_dir
# Share a dir with Python
python -m http.server
# With JavaScript
npm install -g http-server
http-server .
# Open your browser on http://localhost or http://<your-ip> from another computer.
# Usually you would use a web server like Apache or Nginx with extra settings.
# You also need to configure your local network to make it accessible from the Internet.

How to find interesting stuff?

Your first reflex should be to track the most recent posts on the sub. If you're watchful, there's always a comment posted with some details, like this one, from which you can get the complete list of links for your shopping (the "Urls file" link). You can still index a site on your own, with KoalaBear84's Indexer, if the "Urls file" link is broken or if the content has changed.
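If you want to run the Indexer yourself, here is a minimal sketch; it assumes KoalaBear84's OpenDirectoryDownloader binary is on your PATH and uses the same -t/-u flags as the download workflow further down, with a placeholder URL:
# Index an OD and write its link list under Scans/
OpenDirectoryDownloader -t 5 -u "http://111.111.111.111/"
# The resulting .txt file can then be grepped or fed to wget/aria2 (see below)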
Thanks to the hard work of some folks, you can summon a bot, u/ODScanner, to generate this report. In the past, u/KoalaBear84 was devoted to this job. Although some folks told us he is a human being, I don't believe them ;-)
You should also take a look at "The Eye", a gigantic opendir maintained by archivists. Their search engine seems to be broken currently, but you can use alternative search engines, like Eyedex for instance.
Are you looking for a specific file? Some search engines index the opendirs posted here and are updated almost in real time:
Don't you think that clicking on every post and checking them one by one is a bit cumbersome? There is good news for you: with this tip you can get a listing of all the working dirs.

Any way to find some new ODs by myself?

Yes you can!
The most usual solution starts with the traditional search engines or meta-engines (Google, Bing, DuckDuckGo, ...) and their advanced syntax, as in this example: -inurl:(jsp|pl|php|html|aspx|htm|cf|shtml). Opendirs are just classic websites after all.
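For illustration, here is one way to build and URL-encode a dork like that from a shell; the search terms are placeholders, not the exact query behind the linked example:
# Assemble a typical opendir dork and print a ready-to-open Google URL
QUERY='intitle:"index of" "parent directory" -inurl:(jsp|pl|php|html|aspx|htm|cf|shtml)'
python3 -c "import urllib.parse, sys; print('https://www.google.com/search?q=' + urllib.parse.quote(sys.argv[1]))" "$QUERY"
# You can of course also paste the query directly into Google, Bing or DuckDuckGo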
If you're lazy, there is a plethora of frontends to these engines that can help you build the perfect query and redirect you to them. Here is my favorite.
As an alternative, and often a complement, you can use IoT (Internet of Things) search engines like Shodan, ZoomEye, Censys and Fofa. Their approach to building an index is totally different from the other engines: rather than crawling the Web across hyperlinks, they scan every port across all available IP addresses and, for HTTP servers, they index only the homepage. Here is an equivalent example.
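A minimal sketch of the same idea with the official Shodan CLI (assuming you have installed it with pip and configured an API key; some search filters require a membership, and the query below is just one common example, not the one behind the link above):
# Look for servers whose homepage title is a directory listing
shodan search --fields ip_str,port,hostnames 'http.title:"Index of /"'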

I'd like to share one. Any advice?

Just respect the code of conduct. All the rules are listed on the side panel of the sub.
Maybe one more point though. Getting the same site reposted many times in a short period lowers the signal-to-noise ratio. A repost of an old OD with different content is accepted, but try to keep a good balance. For finding duplicates, the Reddit search is not very reliable, so here are 2 tips:
  1. Using KoalaBear84's page
  2. With a Google search: site:reddit.com/opendirectories my_url
Why can't we post torrent files, mega links, obfuscated links, ...?
The short answer: They're simply not real opendirs.
A more elaborate answer:
These types of resources are often associated with piracy and monitored, and Reddit's admins have to forward the copyright infringement notices to the mods of the sub. When it gets too repetitive, the risk is that the sub gets shut down, as was the case for this famous one.
As for obfuscation (Rule 5), with base64 encoding for instance, the mods' point of view is that they prefer to accept URLs in the clear and deal with the rare DMCA notices. Those notices are probably automated, so the sub stays under the human radar; that would no longer be the case with obfuscation techniques.
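For illustration only, this is all that base64 "obfuscation" of a link amounts to; the URL is a made-up placeholder:
URL='http://111.111.111.111/files/'
ENCODED=$(printf '%s' "$URL" | base64)
echo "$ENCODED"                              # the "obfuscated" form
printf '%s' "$ENCODED" | base64 -d; echo     # decoding trivially recovers the clear URL (-D on BSD/macOS)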
There are some exceptions however:
Google Drives and Calibre servers (ebooks) are tolerated. For the gdrives there is no clear answer, but it may be because one could argue that these dirs are generally not deliberately opened for piracy.
Calibre servers are not real ODs, but you can use the same tools to download their content. In the past a lot of them were posted and some people started to complain about that. A new sub was created, but it is not very active, since a new player came into the game: Calishot, a search engine updated monthly.

I want to download all the content at once. How do I do it?

You have to use an appropriate tool. An exhaustive list would probably require a dedicated post.
When choosing, you may consider different criteria. Here are some of them:
Here is an overview of the main open-source/free software for this purpose.
Note: Don't consider this list as completely reliable as I didn't test all of them.
Tool | Concurrent downloads | Able to preserve the original tree | Client/Server mode | CLI | TUI | GUI | Web UI | Browser plugin
wget | N | Y | N | Y | ? | ? | Y | ?
wget2 | Y | Y | N | Y | ? | ? | ? | ?
aria2 | Y | N | Y | Y | Y | ? | Y | ?
rclone | Y | Y | N | Y | ? | ? | Y | ?
IDM | Y | N | N | N | N | Y | N | N
JDownloader2 | Y | N | Y | N | N | Y | N | N
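Since rclone is listed as able to preserve the tree, here is a minimal sketch of mirroring an OD with its HTTP backend (URL and destination are placeholders; tune --transfers to stay polite):
# Mirror an opendir while keeping the directory structure
rclone copy --http-url "http://111.111.111.111/" :http: ./mirror --transfers 4 --progress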
Here is my own path:
# To download a URL recursively
wget -r -nc --no-parent -l 200 -e robots=off -R "index.html*" -x http://111.111.111.111
# Sometimes I want to filter the list of files before the download.
# Start by indexing the files
OpenDirectoryDownloader -t 10 -u http://111.111.111.111
# A new file is created: Scans/http:__111.111.111.111_.txt
# Now I'm able to filter the list of links with my favourite editor or with grep/egrep
egrep -o -e '^.*\.(epub|pdf|mobi|opf|cover\.jpg)$' Scans/http:__111.111.111.111_.txt >> files.txt
# Then I can pass this file as an input to wget and preserve the directory structure
wget -r -nc -c --no-parent -l 200 -e robots=off -R "index.html*" -x --no-check-certificate -i files.txt
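If you prefer to parallelize the last step, the same files.txt can be fed to aria2 instead of wget (standard aria2c flags; adjust paths to taste):
# Download the filtered list with up to 8 parallel downloads
aria2c -i files.txt -j 8 -x 4 -d ./downloads
# Note: unlike wget -x, this flattens everything into one directory (see the table above)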
submitted by SubliminalPoet to myODs [link] [comments]


2021.05.22 19:20 krazybug A few tips for the newcomers on this sub !

Slava Ukraini!


This post is mainly intended to help people who are just discovering this sub get started. It could also be useful to other folks, who knows?

What is an open directory?

Open directories (aka ODs or opendirs) are just unprotected websites that you can browse recursively, without any required authentication. You can freely download individual files from them. They're organised in a folder structure, like a local directory tree on your computer. This is really convenient, as you can also download several files at once, recursively (see below).
These sites are sometimes deliberately left open and sometimes left open inadvertently (seedboxes, personal websites with poorly protected dirs, ...). For the latter, once someone has posted them here, they're often hammered by many concurrent downloads and go down under the heavy load. When the owners realise it, they usually decide to protect them behind a firewall or to require a password to limit access.
That's when the famous "He's dead, Jim!" flair comes in.
Technically, an opendir is nothing more than a local directory, shared by a running web server:
cd my_dir
# Share a dir with Python
python -m SimpleHTTPServer
# With JavaScript
npm install -g http-server
http-server .
# Open your browser on http://localhost or http://<your-ip> from another computer.
# Usually you would use a web server like Apache or Nginx with extra settings.
# You also need to configure your local network to make it accessible from the Internet.

How to find interesting stuff?

Your first reflex should be to track the most recent posts on the sub. If you're watchful, there's always a comment posted with some details, like this one, from which you can get the complete list of links for your shopping (the "Urls file" link). You can still index a site on your own, with KoalaBear84's Indexer, if the "Urls file" link is broken or if the content has changed.
Thanks to the hard work of some folks, you can summon a bot, u/ODScanner, to generate this report. In the past, u/KoalaBear84 was devoted to this job. Although some folks told us he is a human being, I don't believe them ;-)
You should also take a look at "The Eye", a gigantic opendir maintained by archivists. Their search engine seems to be broken currently, but you can use alternative search engines, like Eyedex for instance.
Are you looking for a specific file? Some search engines index the opendirs posted here and are updated almost in real time:
Don't you think that clicking on every post and checking them one by one is a bit cumbersome? There is good news for you: with this tip you can get a listing of all the working dirs.

Any way to find some new ODs by myself?

Yes you can!
The most usual solution starts with the traditional search engines or meta-engines (Google, Bing, DuckDuckGo, ...) and their advanced syntax, as in this example: -inurl:(jsp|pl|php|html|aspx|htm|cf|shtml). Opendirs are just classic websites after all.
If you're lazy, there is a plethora of frontends to these engines that can help you build the perfect query and redirect you to them. Here is my favorite.
As an alternative, and often a complement, you can use IoT (Internet of Things) search engines like Shodan, ZoomEye, Censys and Fofa. Their approach to building an index is totally different from the other engines: rather than crawling the Web across hyperlinks, they scan every port across all available IP addresses and, for HTTP servers, they index only the homepage. Here is an equivalent example.

I'd like to share one. Any advice?

Just respect the code of conduct. All the rules are listed on the side panel of the sub.
Maybe one more point though. Getting the same site reposted many times in a short period lowers the signal-to-noise ratio. A repost of an old OD with different content is accepted, but try to keep a good balance. For finding duplicates, the Reddit search is not very reliable, so here are 2 tips:
  1. Using KoalaBear84's page
  2. With a Google search: site:reddit.com/opendirectories my_url

Why can't we post torrent files, mega links, obfuscated links, ...?

The short answer: They're simply not real opendirs.
A more elaborate answer:
These types of resources are often associated with piracy and monitored, and Reddit's admins have to forward the copyright infringement notices to the mods of the sub. When it gets too repetitive, the risk is that the sub gets shut down, as was the case for this famous one.
As for obfuscation (Rule 5), with base64 encoding for instance, the mods' point of view is that they prefer to accept URLs in the clear and deal with the rare DMCA notices. Those notices are probably automated, so the sub stays under the human radar; that would no longer be the case with obfuscation techniques.
There are some exceptions however:
Google Drives and Calibre servers (ebooks) are tolerated. For the gdrives there is no clear answer, but it may be because one could argue that these dirs are generally not deliberately opened for piracy.
Calibre servers are not real ODs, but you can use the same tools to download their content. In the past a lot of them were posted and some people started to complain about that. A new sub was created, but it is not very active, since a new player came into the game: Calishot, a search engine updated monthly.

I want to download all the content at once. How do I do it?

You have to use an appropriate tool. An exhaustive list would probably require a dedicated post.
When choosing, you may consider different criteria. Here are some of them:
Here is an overview of the main open-source/free software for this purpose.
Note: Don't consider this list as completely reliable as I didn't test all of them.
Tool | Concurrent downloads | Able to preserve the original tree | Client/Server mode | CLI | TUI | GUI | Web UI | Browser plugin
wget | N | Y | N | Y | ? | ? | Y | ?
wget2 | Y | Y | N | Y | ? | ? | ? | ?
aria2 | Y | N | Y | Y | Y | ? | Y | ?
rclone | Y | Y | N | Y | ? | ? | Y | ?
IDM | Y | N | N | N | N | Y | N | N
JDownloader2 | Y | N | Y | N | N | Y | N | N
Here is my own path:
# To download a URL recursively
wget -r -nc --no-parent -l 200 -e robots=off -R "index.html*" -x http://111.111.111.111
# Sometimes I want to filter the list of files before the download.
# Start by indexing the files
OpenDirectoryDownloader -t 10 -u http://111.111.111.111
# A new file is created: Scans/http:__111.111.111.111_.txt
# Now I'm able to filter the list of links with my favourite editor or with grep/egrep
egrep -o -e '^.*\.(epub|pdf|mobi|opf|cover\.jpg)$' Scans/http:__111.111.111.111_.txt >> files.txt
# Then I can pass this file as an input to wget and preserve the directory structure
wget -r -nc -c --no-parent -l 200 -e robots=off -R "index.html*" -x --no-check-certificate -i files.txt

Conclusion:

Welcome aboard, and kudos to all the contributors, especially the most involved: u/KoalaBear84, u/Chaphasilor, u/MCOfficer, u/ringofyre
submitted by krazybug to opendirectories [link] [comments]


2019.08.05 20:34 spookex Find unsecured cameras using google

Original comment by u/renakhys
For example you can try to google:
inurl:ViewerFrame?Mode=
and you will be able to see random people's cameras...
submitted by spookex to spxsavedlinks [link] [comments]


2019.01.03 22:46 sg4_mememaster WHY IS THIS A THING

Recently I stumbled onto a YouTube video called "5 Sneaky Ways Cities Don't Control Our Behavior" (link: https://www.youtube.com/watch?v=x1DubVqWx4I&t=83s). He showed a URL and said "the cities have cameras for a reason", and the URL he showed was this: (inurl:ViewerFrame?Mode=). When you look this up, the first link you find is this: http://184.183.28.12:4000/ViewerFrame?Mode=Motion&Language=0, and this link very creepily brings you to a camera where you can control where it looks and everything. I don't know why this is a thing, but it shouldn't be.
submitted by sg4_mememaster to Mysteries [link] [comments]


2015.02.23 07:48 ILL_BE_WATCHING_YOU Here's a list of useful Google search queries.

*inurl:"CgiStart?page="
submitted by ILL_BE_WATCHING_YOU to illbewatchingyou [link] [comments]


2014.06.26 17:52 h-v-smacker How I compiled a list of all woodsmithshop.com free plans

I've been looking for plans and ideas for a small router table, and came across several very good materials from woodsmithshop.com. So I became interested in "how many are there" and my hoarder mode activated. I tried googling with "inurl:www.woodsmithshop/download", but working with files one-by-one tired me out after the first dozen. So I automated this process. After all, there are over 150 of them.
UPD: Some (wink-wink, nudge-nudge) people complain that this bypasses the e-mail subscription and is very bad "in spirit". So don't forget to sign up for those. I recommend 10 Minute Mail, a trustworthy email provider; the duration of its service should be more than sufficient to cover the time needed to complete the download.
First, I decided to use the actual source and not google. So, to extract all the PDF's URLs from the download page, I used:
wget -qO- 'http://www.woodsmithshop.com/episodes/downloads/' | grep -o -e '/download/[0-9]\+/.\+\.pdf' | sed 's/^/http:\/\/www.woodsmithshop.com/' > pdf-list.txt
The result should be like this: http://pastebin.com/RjWCRdyx http://pastebin.com/m5jrtzcx
Then, to download all the files in the list,
more pdf-list.txt | xargs -n1 wget
If you don't have a *NIX OS handy (as should be obvious by now, all that command line stuff clearly labels me as a Linux user), you can just grab the list from the pastebin and feed it into any download manager that would accept a simple text file with a URL per line.
UPD: Changed the one-liner and added another pastebin result to include 'http://' before URLs. Now any wget user can just save the list from http://pastebin.com/m5jrtzcx to a text file (say, pdf-list.txt) and run
wget -i pdf-list.txt
to download them. The download manager wget is available for any platform, and it can use a list of URLs without any additional stuff.
I've been looking through these files over the previous day, and definitely feel enriched and motivated by it. I hope someone would find this useful.
UPD2: now that Season 8 files are out, one can quickly make a list of only those by running:
wget -qO- 'http://www.woodsmithshop.com/episodes/downloads/' | grep -o -e '/download/8[0-9]\+/.\+\.pdf' | sed 's/^/http:\/\/www.woodsmithshop.com/' > pdf-list.txt
submitted by h-v-smacker to woodworking [link] [comments]


2014.03.17 04:35 Phant0m_Cat See what interesting things you'll find.

Use Google to search for this:
inurl:"ViewerFrame?Mode=" 
I found giraffes!
submitted by Phant0m_Cat to woahdude [link] [comments]


2013.06.14 21:12 waysofmylife (Part 2 of 14) The Power of Google - How to Search Google for Hidden Secrets - Files Containing Important Information

This is part 2 of 14 - You can type these directly into Google to find interesting things, do vulnerability tests, find contacts, music, etc.
Files Containing Important Information
intitle:"DocuShare" inurl:"docushare/dsweb/" -faq -gov -edu
"#mysql dump" filetype:sql
"#mysql dump" filetype:sql 21232f297a57a5a743894a0e4a801fc3
"allow_call_time_pass_reference" "PATH_INFO"
"Certificate Practice Statement" inurl:(PDF DOC)
"Generated by phpSystem"
"generated by wwwstat"
"Host Vulnerability Summary Report"
"HTTP_FROM=googlebot" googlebot.com "Server_Software="
"Index of" / "chat/logs"
"Installed Objects Scanner" inurl:default.asp
"MacHTTP" filetype:log inurl:machttp.log
"Mecury Version" "Infastructure Group"
"Microsoft (R) Windows * (TM) Version * DrWtsn32 Copyright (C)" ext:log
"Most Submitted Forms and Scripts" "this section"
"Network Vulnerability Assessment Report"
"not for distribution" confidential
"not for public release" -.edu -.gov -.mil
"phone * * *" "address *" "e-mail" intitle:"curriculum vitae"
"phpMyAdmin" "running on" inurl:"main.php"
"produced by getstats"
"Request Details" "Control Tree" "Server Variables"
"robots.txt" "Disallow:" filetype:txt
"Running in Child mode"
"sets mode: +p"
"sets mode: +s"
"Thank you for your order" +receipt
"This is a Shareaza Node"
"This report was generated by WebLog"
( filetype:mail filetype:eml filetype:mbox filetype:mbx ) intext:passwordsubject
(intitle:"PRTG Traffic Grapher" inurl:"allsensors")(intitle:"PRTG Traffic Grapher - Monitoring Results")
(intitle:WebStatistica inurl:main.php) (intitle:"WebSTATISTICA server") -
inurl:statsoft -inurl:statsoftsa -inurl:statsoftinc.com -edu -software -rob
(inurl:"robot.txt" inurl:"robots.txt" ) intext:disallow filetype:txt
+":8080" +":3128" +":80" filetype:txt
+"HSTSNR" -"netop.com"
-site:php.net -"The PHP Group" inurl:source inurl:url ext:pHp
94FBR "ADOBE PHOTOSHOP"
AIM buddy lists
allinurl:/examples/jsp/snp/snoop.jsp
allinurl:cdkey.txt
allinurl:servlet/SnoopServlet
cgiirc.conf
cgiirc.conf
contacts ext:wml
data filetype:mdb -site:gov -site:mil
exported email addresses
ext:(doc pdf xls txt ps rtf odt sxw psw ppt pps xml) (intext:confidential salary intext:"budget approved") inurl:confidential
ext:asp inurl:pathto.asp
ext:ccm ccm -catacomb
ext:CDX CDX
ext:cgi inurl:editcgi.cgi inurl:file=
ext:conf inurl:rsyncd.conf -cvs -man
ext:conf NoCatAuth -cvs
ext:dat bpk.dat
ext:DBF DBF
ext:DCA DCA
ext:gho gho
ext:ics ics
ext:ini intext:env.ini
ext:jbf jbf
ext:ldif ldif
ext:log "Software: Microsoft Internet Information Services ."
ext:mdb inurl:*.mdb inurl:fpdb shop.mdb
ext:nsf nsf -gov -mil
ext:plist filetype:plist inurl:bookmarks.plist
ext:pqi pqi -database
ext:reg "username=*" putty
ext:txt "Final encryption key"
ext:txt inurl:dxdiag
ext:vmdk vmdk
ext:vmx vmx
filetype:asp DBQ=" * Server.MapPath("*.mdb")
filetype:bkf bkf
filetype:blt "buddylist"
filetype:blt blt +intext:screenname
filetype:cfg auto_inst.cfg
filetype:cnf inurl:_vti_pvt access.cnf
filetype:conf inurl:firewall -intitle:cvs
filetype:config web.config -CVS
filetype:ctt Contact
filetype:ctt ctt messenger
filetype:eml eml +intext:"Subject" +intext:"From" +intext:"To"
filetype:fp3 fp3
filetype:fp5 fp5 -site:gov -site:mil -"cvs log"
filetype:fp7 fp7
filetype:inf inurl:capolicy.inf
filetype:lic lic intext:key
filetype:log access.log -CVS
filetype:log cron.log
filetype:mbx mbx intext:Subject
filetype:myd myd -CVS
filetype:ns1 ns1
filetype:ora ora
filetype:ora tnsnames
filetype:pdb pdb backup (Pilot Pluckerdb)
filetype:php inurl:index inurl:phpicalendar -site:sourceforge.net
filetype:pot inurl:john.pot
filetype:PS ps
filetype:pst inurl:"outlook.pst"
filetype:pst pst -from -to -date
filetype:qbb qbb
filetype:QBW qbw
filetype:rdp rdp
filetype:reg "Terminal Server Client"
filetype:vcs vcs
filetype:wab wab
filetype:xls -site:gov inurl:contact
filetype:xls inurl:"email.xls"
Financial spreadsheets: finance.xls
Financial spreadsheets: finances.xls
Ganglia Cluster Reports
haccess.ctl (one way)
haccess.ctl (VERY reliable)
ICQ chat logs, please...
intext:"Session Start * * * ::* *" filetype:log
intext:"Tobias Oetiker" "traffic analysis"
intext:(password passcode) intext:(username userid user) filetype:csv
intext:gmail invite intext:http://gmail.google.com/gmail/a
intext:SQLiteManager inurl:main.php
intext:ViewCVS inurl:Settings.php
intitle:"admin panel" +"Powered by RedKernel"
intitle:"Apache::Status" (inurl:server-status inurl:status.html inurl:apache.html)
intitle:"AppServ Open Project" -site:www.appservnetwork.com
intitle:"ASP Stats Generator ." "ASP Stats Generator" "2003-2004 weppos"
intitle:"Big Sister" +"OK Attention Trouble"
intitle:"curriculum vitae" filetype:doc
intitle:"edna:streaming mp3 server" -forums
intitle:"FTP root at"
intitle:"index of" +myd size
intitle:"Index Of" -inurl:maillog maillog size
intitle:"Index Of" cookies.txt size
intitle:"index of" mysql.conf OR mysql_config
intitle:"Index of" upload size parent directory
intitle:"index.of *" admin news.asp configview.asp
intitle:"index.of" .diz .nfo last modified
intitle:"Joomla - Web Installer"
intitle:"LOGREP - Log file reporting system" -site:itefix.no
intitle:"Multimon UPS status page"
intitle:"PHP Advanced Transfer" (inurl:index.php inurl:showrecent.php )
intitle:"PhpMyExplorer" inurl:"index.php" -cvs
intitle:"statistics of" "advanced web statistics"
intitle:"System Statistics" +"System and Network Information Center"
intitle:"urchin (53admin)" ext:cgi
intitle:"Usage Statistics for" "Generated by Webalizer"
intitle:"wbem" compaq login "Compaq Information Technologies Group"
intitle:"Web Server Statistics for ****"
intitle:"web server status" SSH Telnet
intitle:"Welcome to F-Secure Policy Manager Server Welcome Page"
intitle:"welcome.to.squeezebox"
intitle:admin intitle:login
intitle:Bookmarks inurl:bookmarks.html "Bookmarks
intitle:index.of "Apache" "server at"
intitle:index.of cleanup.log
intitle:index.of dead.letter
intitle:index.of inbox
intitle:index.of inbox dbx
intitle:index.of ws_ftp.ini
intitle:intranet inurl:intranet +intext:"phone"
inurl:"/axs/ax-admin.pl" -script
inurl:"/cricket/grapher.cgi"
inurl:"bookmark.htm"
inurl:"cacti" +inurl:"graph_view.php" +"Settings Tree View" -cvs -RPM
inurl:"newsletteadmin/"
inurl:"newsletteadmin/" intitle:"newsletter admin"
inurl:"putty.reg"
inurl:"smb.conf" intext:"workgroup" filetype:conf conf
inurl:*db filetype:mdb
inurl:/cgi-bin/pass.txt
inurl:/_layouts/settings
inurl:admin filetype:xls
inurl:admin intitle:login
inurl:backup filetype:mdb
inurl:build.err
inurl:cgi-bin/printenv
inurl:cgi-bin/testcgi.exe "Please distribute TestCGI"
inurl:changepassword.asp
inurl:ds.py
inurl:email filetype:mdb
inurl:fcgi-bin/echo
inurl:forum filetype:mdb
inurl:forward filetype:forward -cvs
inurl:getmsg.html intitle:hotmail
inurl:log.nsf -gov
inurl:main.php phpMyAdmin
inurl:main.php Welcome to phpMyAdmin
inurl:netscape.hst
inurl:netscape.hst
inurl:netscape.ini
inurl:odbc.ini ext:ini -cvs
inurl:perl/printenv
inurl:php.ini filetype:ini
inurl:preferences.ini "[emule]"
inurl:profiles filetype:mdb
inurl:report "EVEREST Home Edition "
inurl:server-info "Apache Server Information"
inurl:server-status "apache"
inurl:snitz_forums_2000.mdb
inurl:ssl.conf filetype:conf
inurl:tdbin
inurl:vbstats.php "page generated"
inurl:wp-mail.php + "There doesn't seem to be any new mail."
inurl:XcCDONTS.asp
ipsec.conf
ipsec.secrets
ipsec.secrets
Lotus Domino address books
mail filetype:csv -site:gov intext:name
Microsoft Money Data Files
mt-db-pass.cgi files
MySQL tabledata dumps
mystuff.xml - Trillian data files
OWA Public Folders (direct view)
Peoples MSN contact lists
php-addressbook "This is the addressbook for *" -warning
phpinfo()
phpMyAdmin dumps
phpMyAdmin dumps
private key files (.csr)
private key files (.key)
Quicken data files
rdbqds -site:.edu -site:.mil -site:.gov
robots.txt
site:edu admin grades
site:www.mailinator.com inurl:ShowMail.do
SQL data dumps
Squid cache server reports
Unreal IRCd
WebLog Referrers
Welcome to ntop!
submitted by waysofmylife to google [link] [comments]


2013.06.14 20:52 waysofmylife (Part 1 of 14) The Power of Google - How to Search Google for Hidden Secrets - Advisories and Vulnerabilities

I use this for education, testing, and finding prospects at work. I had a few requests regarding how I find contact info to generate sales at work. I decided to post the master list of search strings that I have saved. The strings can be altered in any way that fits your needs.
Advisories and Vulnerabilities
"1999-2004 FuseTalk Inc" -site:fusetalk.com
"2003 DUware All Rights Reserved"
"2004-2005 ReloadCMS Team."
"2005 SugarCRM Inc. All Rights Reserved" "Powered By SugarCRM"
"Active Webcam Page" inurl:8080
"Based on DoceboLMS 2.0"
"BlackBoard 1.5.1-f © 2003-4 by Yves Goergen"
"BosDates Calendar System " "powered by BosDates v3.2 by BosDev"
"Calendar programming by AppIdeas.com" filetype:php
"Copyright 2000 - 2005 Miro International Pty Ltd. All rights reserved" "Mambo is Free Software released"
"Copyright 2004 © Digital Scribe v.1.4"
"Copyright © 2002 Agustin Dondo Scripts"
"CosmoShop by Zaunz Publishing" inurl:"cgi-bin/cosmoshop/lshop.cgi"
-V8.10.106 -V8.10.100 -V.8.10.85 -V8.10.108 -V8.11*
"Cyphor (Release:" -www.cynox.ch
"delete entries" inurl:admin/delete.asp
"driven by: ASP Message Board"
"Enter ip" inurl:"php-ping.php"
"IceWarp Web Mail 5.3.0" "Powered by IceWarp"
"Ideal BB Version: 0.1" -idealbb.com
"index of" intext:fckeditor inurl:fckeditor
"inurl:/site/articles.asp?idcategory="
"Maintained with Subscribe Me 2.044.09p"+"Professional" inurl:"s.pl"
"Mimicboard2 086"+"2000 Nobutaka Makino"+"password"+"message" inurl:page=1
"News generated by Utopia News Pro" "Powered By: Utopia News Pro"
"Obtenez votre forum Aztek" -site:forum-aztek.com
"Online Store - Powered by ProductCart"
"PhpCollab . Log In" "NetOffice . Log In" (intitle:"index.of." intitle:phpcollabnetoffice
inurl:phpcollabnetoffice -gentoo)
"portailphp v1.3" inurl:"index.php?affiche" inurl:"PortailPHP" -site:safari-msi.com
"Powered *: newtelligence" ("dasBlog 1.6" "dasBlog 1.5" "dasBlog 1.4""dasBlog 1.3")
"powered by 4images"
"Powered by A-CART"
"powered by active php bookmarks" inurl:bookmarks/view_group.php?id=
"Powered by AJ-Fork v.167"
"Powered by and copyright class-1" 0.24.4
"powered by antiboard"
"Powered by autolinks pro 2.1" inurl:register.php
"Powered by AzDg" (2.1.3 2.1.2 2.1.1)
"powered by claroline" -demo
"Powered by Coppermine Photo Gallery"
"Powered by Coppermine Photo Gallery" ( "v1.2.2 b" "v1.2.1" "v1.2" "v1.1" "v1.0")
"powered by CubeCart 2.0"
"Powered by CubeCart"
"Powered by CuteNews"
"Powered by DCP-Portal v5.5"
"Powered by DMXReady Site Chassis Manager" -site:dmxready.com
"Powered by FUDForum 2.6" -site:fudforum.org -johnny.ihackstuff
"Powered by FUDForum 2.7" -site:fudforum.org -johnny.ihackstuff
"Powered by FUDforum"
"powered by Gallery v" "[slideshow]""images" inurl:gallery
"Powered by Gallery v1.4.4"
"Powered by GTChat 0.95"+"User Login"+"Remember my login information"
"powered by guestbook script" -ihackstuff -exploit
"powered by GuppY v4""Site créé avec GuppY v4"
"Powered by IceWarp Software" inurl:mail
"Powered by Ikonboard 3.1.1"
"powered by ITWorking"
"Powered by Loudblog"
"Powered by MD-Pro" "made with MD-Pro"
"Powered by Megabook *" inurl:guestbook.cgi
"Powered by MercuryBoard [v1"
"powered by minibb" -site:www.minibb.net -intext:1.7f
"Powered by My Blog" intext:"FuzzyMonkey.org"
"Powered by ocPortal" -demo -ocportal.com
"Powered by PHP Advanced Transfer Manager"
"powered by php icalendar" -ihackstuff -exploit
"powered by php photo album" inurl:"main.php?cmd=album" -demo2 -pitanje
"powered by PhpBB 2.0.15" -site:phpbb.com
"Powered By phpCOIN 1.2.2"
"powered by phplist" inurl:"lists/?p=subscribe" inurl:"lists/index.php?p=subscribe" -ubbi -bugs +phplist
-tincan.co.uk
"Powered by PowerPortal v1.3"
"powered by runcms" -runcms.com -runcms.org
"powered by sblog" +"version 0.7"
"Powered by Simplog"
"powered by sphider" -exploit -ihackstuff -www.cs.ioc.ee
"Powered by UPB" (b 1.0)(1.0 final)(Public Beta 1.0b)
"Powered by Woltlab Burning Board" -"2.3.3" -"v2.3.3" -"v2.3.2" -"2.3.2"
"Powered by WordPress" -html filetype:php -demo -wordpress.org -bugtraq
"Powered by WowBB" -site:wowbb.com
"Powered by Xaraya" "Copyright 2005"
"Powered by XHP CMS" -ihackstuff -exploit -xhp.targetit.ro
"Powered by XOOPS 2.2.3 Final"
"Powered by YaPig V0.92b"
"Powered by yappa-ng"
"Powered by Zorum 3.5"
"Powered by: Land Down Under 800" "Powered by: Land Down Under 801" - www.neocrome.net
"Powered By: lucidCMS 1.0.11"
"running: Nucleus v3.1" -.nucleuscms.org -demo
"Site powered By Limbo CMS"
"Software PBLang" 4.65 filetype:php
"SquirrelMail version 1.4.4" inurl:src ext:php
"Thank You for using WPCeasy"
"This page has been automatically generated by Plesk Server Administrator"
"This script was created by Php-ZeroNet" "Script . Php-ZeroNet"
"This website engine code is copyright" "2005 by Clever Copy" -inurl:demo
"This website powered by PHPX" -demo
"This website was created with phpWebThings 1.4"
"Welcome to the versatileBulletinBoard" "Powered by versatileBulletinBoard"
"You have not provided a survey identification number" ERROR -xoops.org "please contact"
("powered by nocc" intitle:"NOCC Webmail") -site:sourceforge.net -Zoekinalles.nl -analysis
("Skin Design by Amie of Intense")("Fanfiction Categories" "Featured Stories")("default2, 3column,
Romance, eFiction")
("This Dragonfly™ installation was" "Thanks for downloading Dragonfly") -inurl:demo -inurl:cpgnuke.com
(intitle:"Flyspray setup""powered by flyspray 0.9.7") -flyspray.rocks.cc
(intitle:"metaframe XP Login")(intitle:"metaframe Presentation server Login")
+"Powered by Invision Power Board v2.0.0..2"
+"Powered by phpBB 2.0.6..10" -phpbb.com -phpbb.pl
+intext:"powered by MyBulletinBoard"
Achievo webbased project management
allintitle:aspjar.com guestbook
E-market remote code execution
EarlyImpact Productcart
ext:php intext:"Powered by phpNewMan Version"
ext:pl inurl:cgi intitle:"FormMail " -"Referrer" -"* Denied" -sourceforge -error -cvs -input
filetype:cgi inurl:nbmember.cgi
filetype:cgi inurl:pdesk.cgi
filetype:cgi inurl:tseekdir.cgi
filetype:php intitle:"paNews v2.0b4"
filetype:php inurl:index.php inurl:"module=subjects" inurl:"func=*" (listpages viewpage listcat)
http://www.google.com/search?q=intitle:%22WEB//NEWS+Personal+Newsmanagement%22+intext:%
22%C2%A9+2002-2004+by+Christian+Scheb+-+Stylemotion.de%22%2B%22
intext:"2000-2001 The phpHeaven Team" -sourceforge
intext:"2000-2001 The phpHeaven Team" -sourceforge
intext:"Calendar Program © Copyright 1999 Matt Kruse" "Add an event"
intext:"LinPHA Version" intext:"Have fun"
intext:"PhpGedView Version" intext:"final - index" -inurl:demo
intext:"Powered by CubeCart 3.0.6" intitle:"Powered by CubeCart"
intext:"Powered by DEV web management system" -dev-wms.sourceforge.net -demo
intext:"Powered by flatnuke-2.5.3" +"Get RSS News" -demo
intext:"powered by gcards" -ihackstuff -exploit
intext:"Powered By Geeklog" -geeklog.net
intext:"Powered by phpBB 2.0.13" inurl:"cal_view_month.php"inurl:"downloads.php"
intext:"Powered by Plogger!" -plogger.org -ihackstuff -exploit
intext:"Powered by SimpleBBS v1.1"*
intext:"Powered By: Snitz Forums 2000 Version 3.4.00..03"
intext:("UBB.threads™ 6.2""UBB.threads™ 6.3") intext:"You * not logged *" -site:ubbcentral.com
intitle:"4images - Image Gallery Management System" and intext:"Powered by 4images 1.7.1"
intitle:"b2evo installer" intext:"Installer für Version"
intitle:"blog torrent upload"
intitle:"EMUMAIL - Login" "Powered by EMU Webmail"
intitle:"HelpDesk" "If you need additional help, please email helpdesk at"
intitle:"igenus webmail login"
intitle:"Looking Glass v20040427" "When verifying an URL check one of those"
intitle:"MRTG/RRD" 1.1* (inurl:mrtg.cgi inurl:14all.cgi traffic.cgi)
intitle:"myBloggie 2.1.1..2 - by myWebland"
intitle:"osTicket :: Support Ticket System"
intitle:"PHP TopSites FREE Remote Admin"
intitle:"phpDocumentor web interface"
intitle:"PowerDownload" ("PowerDownload v3.0.2 ©" "PowerDownload v3.0.3 ©" )
-site:powerscripts.org
intitle:"View Img" inurl:viewimg.php
intitle:"WebJeff - FileManager" intext:"login" intext:PassPAsse
intitle:"WordPress > * > Login form" inurl:"wp-login.php"
intitle:admbook intitle:version filetype:php
intitle:guestbook "advanced guestbook 2.2 powered"
intitle:guestbook inurl:guestbook "powered by Advanced guestbook 2.*" "Sign the Guestbook"
intitle:guestbook inurl:guestbook "powered by Advanced guestbook 2.*" "Sign the Guestbook"
intitle:Mantis "Welcome to the bugtracker" "0.15 0.16 0.17 0.18"
intitle:PHPOpenChat inurl:"index.php?language="
intitle:welcome.to.horde
inurl:"/cgi-bin/loadpage.cgi?user_id="
inurl:"/login.asp?folder=" "Powered by: i-Gallery 3.3"
inurl:"/site/articles.asp?idcategory="
inurl:"comment.php?serendipity"
inurl:"extras/update.php" intext:mysql.php -display
inurl:"forumdisplay.php" +"Powered by: vBulletin Version 3.0.0..4"
inurl:"messageboard/Forum.asp?"
inurl:"slxweb.dll"
inurl:"wfdownloads/viewcat.php?list="
inurl:.exe ext:exe inurl:/cgi*/
inurl:/SiteChassisManage
inurl:cal_make.pl
inurl:chitchat.php "choose graphic"
inurl:citrix/metaframexp/default/login.asp? ClientDetection=On
inurl:comersus_message.asp
inurl:course/category.php inurl:course/info.php inurl:iplookup/ipatlas/plot.php
inurl:database.php inurl:info_db.php ext:php "Database V2.*" "Burning Board *"
inurl:directorypro.cgi
inurl:docmgr intitle:"DocMGR" "enter your Username and""und Passwort bitte""saisir votre nom""su
nombre de usuario" -ext:pdf -inurl:"download.php
inurl:gotoURL.asp?url=
inurl:index.php fees shop link.codes merchantAccount
inurl:install.pl intitle:GTchat
inurl:perldiver.cgi ext:cgi
inurl:resetcore.php ext:php
inurl:server.php ext:php intext:"No SQL" -Released
inurl:sphpblog intext:"Powered by Simple PHP Blog 0.4.0"
inurl:sysinfo.cgi ext:cgi
inurl:technote inurl:main.cgifilename=
inurl:tmssql.php ext:php mssql pear adodb -cvs -akbk
inurl:ttt-webmaster.php
inurl:wiki/MediaWiki
Invision Power Board SSI.PHP SQL Injection
mnGoSearch vulnerability
phpLDAPadmin intitle:phpLDAPadmin filetype:php inurl:tree.php inurl:login.php inurl:donate.php (0.9.6 0.9.7)
Powered by PHP-Fusion v6.00.109 © 2003-2005. -php-fusion.co.uk
powered.by.instaBoard.version.1.3
Powered.by:.vBulletin.Version ...3.0.6
Quicksite demopages for Typo3
ReMOSitory module for Mambo
uploadpics.php?did= -forumintext:Generated.by.phpix.1.0? inurl:$mode=album
vBulletin version 3.0.1 newreply.php XSS
VP-ASP Shopping Cart XSS
WEBalbum 2004-2006 duda -ihackstuff -exploit
WebAPP directory traversal
submitted by waysofmylife to google [link] [comments]


2012.09.13 16:08 td888 Giraffe cam

Giraffe cam.
Side note: found via this text "プリセット inurl:”ViewerFrame?Mode=" in Google search
submitted by td888 to controllablewebcams [link] [comments]


2011.04.30 12:01 relic2279 Look for (and post) some unsecured cameras yourself

To find some unsecured cameras on your own, go to google and search for these terms (Be aware that blogspammers also use some of these terms) -- Make sure your google-fu is strong.
  1. inurl:”ViewerFrame?Mode=
  2. intitle:Axis 2400 video server
  3. inurl:/view.shtml
  4. intitle:”Live View / – AXIS” inurl:view/view.shtml^
  5. inurl:ViewerFrame?Mode=
  6. inurl:ViewerFrame?Mode=Refresh
  7. inurl:axis-cgi/jpg
  8. inurl:axis-cgi/mjpg (motion-JPEG)
  9. inurl:view/indexFrame.shtml
  10. inurl:view/index.shtml
  11. inurl:view/view.shtml
  12. liveapplet
  13. intitle:”live view” intitle:axis
  14. intitle:liveapplet
  15. allintitle:”Network Camera NetworkCamera”
  16. intitle:axis intitle:”video server”
  17. intitle:”EvoCam” inurl:”webcam.html”
  18. intitle:”Live NetSnap Cam-Server feed”
  19. intitle:”Live View / – AXIS”
  20. intitle:”Live View / – AXIS 206M”
  21. intitle:”Live View / – AXIS 206W”
  22. intitle:”Live View / – AXIS 210″
  23. inurl:indexFrame.shtml Axis
  24. inurl:”MultiCameraFrame?Mode=Motion”
  25. intitle:start inurl:cgistart
  26. intitle:”WJ-NT104 Main Page”
  27. intext:”MOBOTIX M1″ intext:”Open Menu”
  28. intext:”MOBOTIX M10″ intext:”Open Menu”
  29. intext:”MOBOTIX D10″ intext:”Open Menu”
  30. intitle:snc-z20 inurl:home/
  31. intitle:snc-cs3 inurl:home/
  32. intitle:snc-rz30 inurl:home/
  33. intitle:”sony network camera snc-p1″
  34. intitle:”sony network camera snc-m1″
  35. site:.viewnetcam.com -www.viewnetcam.com
  36. intitle:”Toshiba Network Camera” user login
  37. intitle:”netcam live image”
  38. intitle:”i-Catcher Console – Web Monitor”
submitted by relic2279 to controllablewebcams [link] [comments]

