Check out anisq for a similar implementation
- ffmpeg
- curl
- vlc
- yt-dlp
git clone https://github.com/deniscerri/filma24-cli
cd filma24-cli && chmod +x f24
sudo cp f24 /usr/local/bin/f24
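Once the script is copied into /usr/local/bin, you can run it from any directory; for example, print the help page to verify the install:
f24 -h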
- Download Termux
- Give storage permissions with the command:
termux-setup-storage
git clone https://github.com/deniscerri/filma24-cli
cd filma24-cli && chmod +x f24
cp f24 $PREFIX/bin/f24
- (Disclaimer: VLC on Android does not support setting a Referer header, so there is currently no way to watch (play) on Android.)
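Since only watching is affected, downloading still works on Termux; a hypothetical example that saves a movie into the shared Download folder (the ~/storage/downloads symlink created by termux-setup-storage):
f24 -m [Movie Title] -o ~/storage/downloads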
Either install WSL (Windows Subsystem for Linux) or Git Bash.
- If you choose WSL, you can use the same commands as on generic Linux. (If you are running WSL1, you cannot open VLC, so you can only download with it; upgrade to WSL2 or use Git Bash.)
- On Git Bash you need to copy f24 to the /usr/bin/ directory instead, and you need to run Git Bash as administrator to do that; see the sketch below.
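For Git Bash, a rough sketch of the install, mirroring the Linux steps (run Git Bash as administrator so the copy into /usr/bin/ succeeds):
git clone https://github.com/deniscerri/filma24-cli
cd filma24-cli && chmod +x f24
cp f24 /usr/bin/f24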
- Download a movie:
f24 -m [Movie Title or URL]
- Search interactively and download a movie:
f24 -im [Movie Title]
- Watch a movie instead of downloading it:
f24 -wm [Movie Title]
- Search interactively and watch a movie:
f24 -wim [Movie Title]
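For example, a hypothetical run that searches for a movie, lets you pick the result interactively, and downloads it:
f24 -im "Titanic"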
- Download a TV series (all seasons by default):
f24 -t [Series Title or URL]
- Search interactively and download a TV series:
f24 -it [Series Title]
- Watch a TV series:
f24 -wt [Series Title]
- Watch a particular season:
f24 -wt [Series Title] -s [Season Nr]
- Watch a particular episode:
f24 -wt [Series Title] -s [Season Nr] -e [Episode Nr]
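For example, a hypothetical run that streams a single episode of a series:
f24 -wt "Breaking Bad" -s 1 -e 3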
To download into a custom directory, add the -o option:
f24 [options] [query] -o [directory path]
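For example, a hypothetical download of one season into a custom folder:
f24 -t [Series Title] -s 1 -o ~/Downloads/Series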
You can fill a .txt file with titles or URLs and use it as input; the script will go through all of them.
- If you set a custom season, the script will download that same season number for every entry in the list; the same applies to a custom episode. See the sketch after this list.
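A hypothetical batch run, assuming the path to the .txt file is passed where the title would normally go (Series One and Series Two are placeholder titles, and -s 1 grabs season 1 of each, as described above):
printf 'Series One\nSeries Two\n' > series.txt
f24 -t series.txt -s 1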
-h show this help page
-m sets media type as movie
-t sets media type as tv series
-w watch instead of downloading
-s set a particular season to download. By default it downloads all seasons
-e set a particular episode to download. By default it downloads all episodes
-o set a custom download path. By default it downloads in your current working directory
-i use interactive mode when searching, instead of letting the script pick the result itself
-u update the script
This script was made to make it easier to download movies and TV series through web scraping instead of doing so manually. All content that is downloaded is hosted by third parties, as mentioned on the webpage used for scraping.
Use it at your own risk. Make sure to look up your country's laws before proceeding.
Any copyright infringement claims should be directed at the scraped website or the hosts it uses.