sbaks0820 / gfycat-download
Given only a Gfycat URL, it is difficult to programmatically get the URL for the GIF version of it. This repo therefore has a few helper functions that interact with the Gfycat API in order to download the GIF version of any Gfycat.
Note: Before using this script, you need a client ID and client secret from Gfycat, which can be obtained from the Gfycat developer portal.
My incentive
The reason I created this is that the tools available to sort/filter your saved posts on Reddit are very limited. You have to pay for Reddit Premium to be able to sort by subreddit. Therefore, I figured I'd just store them locally so that I can sort however I want. To do this I first use a third-party export site. The site asks you to log in to your account, then finds and sorts all of your saved posts for you. You can then export this to a file which you can parse and download from. There are tools for downloading Imgur images and albums, but nothing for Gfycat. Although this requires dev access from Gfycat, it's free and very easy to get.
How to Use It
Save these two parameters in a JSON file.
The first step to downloading is getting an access token from Gfycat, which you must include in the header of every request you make after obtaining it. Using your client ID and client secret, you can obtain the access token:
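The token request can be sketched as follows. This is not the repo's actual code (its function names were not preserved in this README); it assumes the credentials JSON uses `client_id` and `client_secret` keys and targets the Gfycat v1 OAuth endpoint:

```python
import json
import urllib.request

# Assumed endpoint for the client-credentials grant (Gfycat v1 API).
TOKEN_URL = "https://api.gfycat.com/v1/oauth/token"

def build_token_request(client_id, client_secret):
    """Build the POST request carrying the credentials as a JSON body."""
    body = json.dumps({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }).encode()
    return urllib.request.Request(
        TOKEN_URL, data=body, headers={"Content-Type": "application/json"}
    )

def get_access_token(client_id, client_secret):
    """Send the request and return the access_token string from the response."""
    with urllib.request.urlopen(build_token_request(client_id, client_secret)) as resp:
        return json.load(resp)["access_token"]
```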
This token is what you'll use for the current session of API access. It usually expires in 3600 seconds (1 hour). Once you have the access token, you're ready to start downloading GIFs!
As an example, say you want to download the Gfycat at the following URL:
Although the URL has more words in the string, the Gfycat ID is the unique ID assigned to this Gfy. You'll use this ID to tell the API which Gfy you want to download. For ease of programming, there's a function that'll get the ID from a URL:
Now that you have the ID, you can call:
The function returns a json object that's the response body from the request. This may seem not abstracted enough for a toolkit like this repo, but the raw response contains all of the URLs that may be of interest.
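That call can be sketched like this (again with hypothetical names), assuming the Gfycat v1 metadata endpoint and a bearer-token `Authorization` header:

```python
import json
import urllib.request

# Assumed Gfycat v1 metadata endpoint.
API_URL = "https://api.gfycat.com/v1/gfycats/{gfy_id}"

def build_info_request(gfy_id, access_token):
    """Build the GET request for a single Gfy, passing the bearer token."""
    return urllib.request.Request(
        API_URL.format(gfy_id=gfy_id),
        headers={"Authorization": f"Bearer {access_token}"},
    )

def get_gfycat_info(gfy_id, access_token):
    """Fetch and return the raw response body as a parsed json object."""
    with urllib.request.urlopen(build_info_request(gfy_id, access_token)) as resp:
        return json.load(resp)
```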
The only function available at the moment just grabs the URL for the largest size from the response json object:
If you go to the above URL, you'll get what you want. If you want to grab some other URL, there are a few options:
- `gifUrl`: this is the one that's implemented right now; it grabs the largest size
- `max1mbGif`: as the name implies, a maximum size of 1 MB
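Assuming the response follows the Gfycat v1 shape (a `gfyItem` object holding the URL fields), the current behavior could look like this hypothetical sketch:

```python
def get_largest_gif_url(info):
    """Grab the largest-size GIF URL from the response json object.
    Assumes the v1 response wraps the fields in a 'gfyItem' object."""
    return info["gfyItem"]["gifUrl"]
```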
Contributing
If you want to change the code to download one of the other sizes, just make a small change to the function: swap the response key for the one you want, or, better, create a new function.
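As a sketch of that better way, the hard-coded key can become a parameter instead (hypothetical name, assuming the same `gfyItem` response shape as above):

```python
def get_url_for_size(info, size_key="gifUrl"):
    """Return the URL stored under size_key (e.g. 'gifUrl', 'max1mbGif')
    in the response's 'gfyItem' object."""
    return info["gfyItem"][size_key]
```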