Like Ra's Naughty Playground



Tumblr is virtually dead
#21
It's a "standard" UNIX program/library called "curl", not something written by me.
https://en.wikipedia.org/wiki/CURL
Reply
#22
before the "purge", I was using their old v1 API to download new images from a list of tumblrs...
it still seems to work, even without authentication...

so basically this call to get a list of photo-posts:
curl -s "${TUMBLR}/api/read?type=photo&num=${MAX_POSTS}&start=0"
(with $TUMBLR being the blog's URL)

and then grep for 'photo-url', a bit sed-magic and wget...
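The "grep for photo-url plus a bit of sed magic" step could be condensed into a small helper like this (the `max-width="1280"` field layout is an assumption about the APIv1 XML response -- check it against a real one):

```shell
# extract_photo_urls: pull the max-width="1280" image URLs out of an
# APIv1 XML response read from stdin, one URL per line.
extract_photo_urls() {
    sed -e "s/<photo-url/\n<photo-url/g" \
        | grep '<photo-url max-width="1280">' \
        | sed -e "s/.*>\(.*\)<\/photo-url.*/\1/"
}

# roughly: curl -s "${TUMBLR}/api/read?type=photo&num=${MAX_POSTS}&start=0" \
#            | extract_photo_urls | xargs -n1 wget -nc
```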
If you want, I can post the script...
Reply
#23
May not be a bad idea. At least if it’s posted here, it can’t get deleted unless Ra kills it, and we can reference it when we need it.

Does that make sense?
Reply
#24
aight, here it is...

You pass it the URL of the blog you want to download. It will create directories for the images and for status files (these store the id of the latest downloaded post, so subsequent runs won't re-download images).
You'll probably need to adapt the WORKDIR variable...

Code:
#!/bin/bash
# Download new photo posts from a Tumblr blog via the old v1 API.
# Usage: pass the blog URL as the first argument.

TUMBLR="$1"
DIRNAME=$(sed -e "s/[^[:alnum:]]/_/g" <<<"$TUMBLR")
WORKDIR=~/workspace/tumblr
IMAGE_DIR="images/$DIRNAME"
MAX_POSTS=100

cd "$WORKDIR" || exit 1
mkdir -p status "$IMAGE_DIR"

STATFILE="status/$DIRNAME"

test -e "$STATFILE" && LAST_POST=$(cat "$STATFILE")
LAST_POST=${LAST_POST:-0}
NEWEST_POST=$LAST_POST

getPost(){
    MAX_SIZE=0
    PHOTO=""

    # iterate over all photo-urls and keep only the biggest picture
    for PHOTO_URL in $(sed -e "s/<photo-url/\n<photo-url/g" <<<"$1" | grep "<photo-url "); do
        PHOTO_TMP=$(sed -e "s/.*>\(.*\)<\/photo-url.*/\1/" <<<"$PHOTO_URL")
        SIZE=$(sed -e "s/.*max-width=\"\([[:digit:]]\+\)\".*/\1/" <<<"$PHOTO_URL")

        if [ "$SIZE" -gt "$MAX_SIZE" ]; then
            MAX_SIZE=$SIZE
            PHOTO=$PHOTO_TMP
        fi
    done

    # wait until no more than 5 wget processes are running
    while [ "$(pgrep -c wget)" -gt 5 ]; do
        sleep 1s
    done

    if [ -n "$PHOTO" ]; then
        echo "downloading $PHOTO"
        wget -P "$IMAGE_DIR" -nc -b "$PHOTO"
    fi
}


POSTS=$(curl -s "$TUMBLR/api/read?type=photo&num=$MAX_POSTS&start=0" | sed -e "s/<post/\n<post/g" | grep "<post ")
# retry over https if the plain-http URL returned nothing
if [ -z "$POSTS" ]; then
    TUMBLR=$(sed -e "s/http:/https:/" <<<"$TUMBLR")
    POSTS=$(curl -s "$TUMBLR/api/read?type=photo&num=$MAX_POSTS&start=0" | sed -e "s/<post/\n<post/g" | grep "<post ")
fi

echo "checking $TUMBLR..."

IFS=$'\n'   # split $POSTS on newlines only (the post lines contain spaces)
for POST in $POSTS; do
    NUMBER=$(sed -e "s/<post\ id=\"\([[:digit:]]\+\)\".*/\1/" <<<"$POST")

    # only fetch posts newer than the last run; remember the newest id we saw
    if [ "$NUMBER" -gt "$LAST_POST" ]; then
        getPost "$POST"
        if [ "$NUMBER" -gt "$NEWEST_POST" ]; then
            NEWEST_POST=$NUMBER
        fi
    fi
done

echo "$NEWEST_POST" > "$STATFILE"
Reply
#25
Good one! Thanks! The way to rip entire Tumblr blogs! (Unless there's an API limitation.)

But for the automatic update I need to get access to the RSS. Or ... create a new WordPress plugin, which would require much more time 😁

Need to check the API...
Reply
#26
(01 Mar 2019, 16:28 )occorics Wrote:
Code:
while [ $(pgrep -c wget) -gt 5 ]; do
    sleep 1s
done

I take it you run several scripts in parallel, right? Otherwise (for one blog only) I would add an '&' at the end of the 'wget' line:

Code:
wget -P "$IMAGE_DIR" -nc -b "$PHOTO"&
Reply
#27
Can’t you take an existing WordPress plugin and adapt it to suit the needs rather than creating a whole new plugin?
Reply
#28
(02 Mar 2019, 02:13 )Tinker D Wrote: Can’t you take an existing Wordpress Plugin and adapt it
That's what I usually do - tinkering 😁

The problem is that there are no similar plugins around. The one I use for the RSS feeds was deleted by the author from everywhere and is, obviously, not supported. It's quite complex and buggy, but it works so far. The most recent version has some bugs fixed, but the caching was removed, which catastrophically degraded the performance, so I use one of the older versions, which, actually, needs to be rewritten.

The bottom line is - currently I tweak only what can be easily tweaked without digging into the details, and what will bring the maximum benefit. (BTW, I'm not even a programmer 😁 )
Reply
#29
There is no access to the RSS feed in either APIv1 or APIv2, but what might work is the following:

Program 1:
o- Grab the last x posts in the XML format
o- Convert the XML into the RSS XML
o- Store the results in memcache, indexed by the site name

Program 2:
o- Serve HTTP requests on localhost
o- Return memcache entry according to the site name in the query

Then configure the RSS feed grabber (the current WP plugin) to use the Program 2 URL

Problem #1: figure out how to convert the Tumblr XML response to an RSS feed.
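That conversion step could be sketched in the same shell style as the download script. Everything here (the `url` attribute, the `<photo-url>` element layout) is an assumption about the APIv1 response, and the memcache/HTTP-serving parts of the two programs are left out:

```shell
# xml_to_rss: hypothetical sketch that turns an APIv1 XML response (stdin)
# into a minimal RSS 2.0 feed (stdout). Field names are assumptions --
# verify them against a real APIv1 response before relying on this.
xml_to_rss() {
    BLOG="$1"
    echo '<?xml version="1.0" encoding="UTF-8"?>'
    echo "<rss version=\"2.0\"><channel><title>$BLOG</title><link>$BLOG</link>"
    # one <post ...> per line, then pull the post URL and the 1280px photo out of each
    sed -e "s/<post /\n<post /g" | grep "<post " | while IFS= read -r POST; do
        LINK=$(sed -e "s/.*<post[^>]* url=\"\([^\"]*\)\".*/\1/" <<<"$POST")
        PHOTO=$(sed -e "s/.*<photo-url max-width=\"1280\">\([^<]*\)<\/photo-url>.*/\1/" <<<"$POST")
        echo "<item><link>$LINK</link><description>&lt;img src=\"$PHOTO\" /&gt;</description></item>"
    done
    echo '</channel></rss>'
}
```

Program 1 would then pipe `curl -s "$TUMBLR/api/read?..." | xml_to_rss "$TUMBLR"` into the memcache store.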
Reply
#30
(01 Mar 2019, 17:54 )Like Ra Wrote: Good one! Thanks! The way to rip entire Tumblr blogs! (Unless there's an API limitation.)

But for the automatic update I need to get access to the RSS. Or ... create a new wordpress plugin, what would require much more time 😁

Need to check the API...

It will only rip everything new. That's what the STATFILE is for: it stores the id of the latest downloaded post, and the script stops there on the next run. The first run is limited by MAX_POSTS.
Reply




Contributors: brothermarcus (1) , essanym (2) , krinlyc (1) , Like Ra (18) , madjack (3) , occorics (8) , Tinker D (4)