How to list all URLs in the source code files of a website with command line tools?
I have a folder that contains the source code files of a website (tons of HTML, CSS and PHP files). I would like to find an automatic method (preferably with command line tools) to scan this folder recursively and get a list of the URLs that are included in these files.
I guess it's possible with find and grep and some regular expressions, but my knowledge of combining these commands is limited.
Basically, it would be a good way to find out whether there are any "hidden" URL calls (data leaks, backdoors or secret download functions) in the code of the CMS or installed plug-ins.
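For example, I imagine combining the two commands roughly like this (untested, and the folder path is just a placeholder):

find /path/to/site -type f \( -name '*.html' -o -name '*.css' -o -name '*.php' \) -exec grep -l 'http' {} +

but that only lists the files that contain http, not the URLs themselves.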
2 Comments
Use the grep command and search recursively for http in the documents.
Syntax (if I remember correctly):
grep -r http /var/html/
Here -r forces a recursive search, although there are other options too; run grep --help to see them. http is the string you are searching for, and /var/html/ is the directory to search in.
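If you want to extract the actual URLs rather than just the lines that contain http, something like this should also work (a rough sketch; the regex is only an approximation of a URL, and the directory is just an example):

grep -rhoE "https?://[^\"'<> ]+" /var/html/ | sort -u

Here -o prints only the matching part of each line, -h suppresses the file names, -E enables extended regular expressions, and sort -u removes duplicate URLs.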