

Having many word lists can be a pain in the ass for two main reasons:
1) It's time-consuming to run each word list one by one.
2) Many of the word lists we have tend to contain duplicates.

However, we can merge all our lists together and remove the duplicates.

Note: De-duplicating doesn't delete every copy of a matching word; it leaves one copy remaining.
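As a quick sanity check (using throwaway files with hypothetical addresses, not the lists from the screenshots below), you can see that `sort -u` keeps exactly one copy of each duplicated line rather than dropping them all:

```shell
# Two throwaway lists sharing one address (hypothetical data)
printf 'alice@example.com\nbob@example.com\n' > list1.txt
printf 'alice@example.com\ncarol@example.com\n' > list2.txt

# Merge and de-duplicate: the shared address survives exactly once
cat list1.txt list2.txt | sort -u
```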

Let's go to the location of our word lists and use cat to display the contents.

[Image: 1.png]

 As you can see, there are two lists of emails, and both lists contain some of the same email addresses.

Let's merge the two lists into a single list containing no duplicates.
Type: cat list1.txt list2.txt | sort -u > newlist.txt

When checking the contents of the new list, we see that it contains no duplicates.

[Image: 2.png]

If our newlist.txt file contains thousands or millions of words, using cat to check the results is a bad idea: we'll spend forever scrolling through the output, and dumping that much text to the terminal can freeze the system.
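A lighter-weight check than printing the whole file is to compare the total line count with the unique line count; if the two numbers match, every line is unique. (This uses the standard `wc`, `sort`, and `uniq` tools, shown here on a small hypothetical list, as an alternative to paging through the output.)

```shell
# Hypothetical de-duplicated list standing in for newlist.txt
printf 'alice@example.com\nbob@example.com\ncarol@example.com\n' > newlist.txt

# Total lines vs. unique lines; equal counts mean no duplicates
wc -l < newlist.txt
sort -u newlist.txt | wc -l

# uniq -d prints only the duplicated lines; no output means the list is clean
sort newlist.txt | uniq -d
```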

Using pipal is a better idea.

Type: pipal newlist.txt

We can see that all the words are unique.

[Image: 3.png]

If your lists contain empty lines, it's best to remove them to avoid errors when cracking.

The quickest way to do this is to visit: http://textmechanic.com/text-tools/basic...pty-lines/
then paste in your text and click 'Remove Empty Lines'.
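If you'd rather not paste a huge list into a website, the same clean-up can be done locally. A common approach (an alternative to the site above, not from the original post) is a `sed` or `grep` one-liner:

```shell
# Hypothetical list containing blank lines
printf 'alice@example.com\n\nbob@example.com\n\n' > newlist.txt

# Delete empty lines with sed, writing the result to a new file
sed '/^$/d' newlist.txt > cleaned.txt

# Equivalent with grep: keep only lines containing at least one character
grep . newlist.txt
```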
