Like Ra's Naughty Playground




Tumblr is virtually dead
#21
It's a "standard" UNIX program/library, called "curl", not something written by me.
https://en.wikipedia.org/wiki/CURL
Reply
#22
before the "purge", I was using their old v1 API to download new images from a list of tumblrs...
it still seems to work, even without authentication...

so basically this call to get a list of photo-posts:
curl -s "${TUMBLR}/api/read?type=photo&num=${MAX_POSTS}&start=0"
(with $TUMBLR being the blog's URL)

and then grep for 'photo-url', a bit of sed magic, and wget...
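In a nutshell, that pipeline looks something like this. A canned excerpt stands in for the live API call here, so the sample XML and URLs below are invented, not real blog data (and like the full script, it relies on GNU sed turning \n in the replacement into a newline):

```shell
#!/bin/bash
# Invented sample of a v1 /api/read?type=photo response:
RESPONSE='<posts><post id="123"><photo-url max-width="1280">https://example.com/pic_1280.jpg</photo-url><photo-url max-width="500">https://example.com/pic_500.jpg</photo-url></post></posts>'

# put each photo-url element on its own line, then keep only the URL between the tags
URLS=$(sed -e "s/<photo-url/\n<photo-url/g" <<<"$RESPONSE" \
       | grep "<photo-url " \
       | sed -e "s/.*>\(.*\)<\/photo-url>.*/\1/")

echo "$URLS"
# each URL would then be handed to: wget -nc "$URL"
```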
If you want, I can post the script...
Reply
#23
May not be a bad idea. At least if it’s posted here, it can’t get deleted unless Ra kills it, and we can reference it when we need it.

Does that make sense?
Reply
#24
aight, here it is...

you pass it the URL of the blog you want to download. It will create directories for the images and for status-files (they store the id of the latest downloaded post, so subsequent runs won't re-download images)
You'll probably need to adapt the WORKDIR variable...

Code:
#!/bin/bash

TUMBLR="$1"
DIRNAME=$(sed -e "s/[^[:alnum:]]/_/g" <<<"$TUMBLR")
WORKDIR=~/workspace/tumblr
IMAGE_DIR="images/$DIRNAME"
MAX_POSTS=100

cd "$WORKDIR" || exit 1
mkdir -p status "$IMAGE_DIR"

STATFILE="status/$DIRNAME"

# id of the newest post downloaded on a previous run (0 = first run)
test -e "$STATFILE" && LAST_POST=$(cat "$STATFILE")
LAST_POST=${LAST_POST:-0}
NEWEST_POST=$LAST_POST

getPost(){
    MAX_SIZE=0
    PHOTO=""

    # iterate over all photo-urls and keep only the biggest pic
    for PHOTO_URL in $(sed -e "s/<photo-url/\n<photo-url/g" <<<"$1" | grep "<photo-url "); do
        PHOTO_TMP=$(sed -e "s/.*>\(.*\)<\/photo-url.*/\1/" <<<"$PHOTO_URL")
        SIZE=$(sed -e "s/.*max-width=\"\([[:digit:]]\+\)\".*/\1/" <<<"$PHOTO_URL")

        if [ "$SIZE" -gt "$MAX_SIZE" ]; then
            MAX_SIZE=$SIZE
            PHOTO=$PHOTO_TMP
        fi
    done

    # wait until no more than 5 wget processes are running
    while [ "$(pgrep -c wget)" -gt 5 ]; do
        sleep 1s
    done

    if [ "$PHOTO" ]; then
        echo "downloading $PHOTO"
        wget -P "$IMAGE_DIR" -nc -b "$PHOTO"
    fi
}


POSTS=$(curl -s "$TUMBLR/api/read?type=photo&num=$MAX_POSTS&start=0" | sed -e "s/<post/\n<post/g" | grep "<post ")
if [ -z "$POSTS" ]; then
    # some blogs only answer over https - retry
    TUMBLR=$(sed -e "s/http:/https:/" <<<"$TUMBLR")
    POSTS=$(curl -s "$TUMBLR/api/read?type=photo&num=$MAX_POSTS&start=0" | sed -e "s/<post/\n<post/g" | grep "<post ")
fi

echo "checking $TUMBLR..."

# split on newlines only, so each post stays one word in the loop below
IFS='
'
for POST in $POSTS; do
    NUMBER=$(sed -e "s/<post\ id=\"\([[:digit:]]\+\)\".*/\1/" <<<"$POST")

    if [ "$NUMBER" -gt "$LAST_POST" ]; then
        getPost "$POST"
        if [ "$NUMBER" -gt "$NEWEST_POST" ]; then
            NEWEST_POST=$NUMBER
        fi
    fi
done

echo "$NEWEST_POST" > "$STATFILE"

Reply
#25
Good one! Thanks! That's the way to rip entire tumblr blogs! (Unless there's an API limitation).

But for the automatic update I need to get access to the RSS. Or ... create a new wordpress plugin, which would require much more time 😁

Need to check the API...
Reply
#26
(01 Mar 2019, 16:28 )occorics Wrote:
Code:
while [ $(pgrep -c wget) -gt 5 ]; do
    sleep 1s
done


I take it you run several scripts in parallel, right? Otherwise (for one blog only) I would add an '&' at the end of the 'wget' line:

Code:
wget -P "$IMAGE_DIR" -nc -b "$PHOTO"&
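For what it's worth, the throttling could also be done with the shell's own job table instead of a system-wide pgrep (which also counts wgets started by anything else). A small self-contained demo of the idea, with a sleep standing in for the wget call:

```shell
#!/bin/bash
# Throttle background jobs by counting this shell's own running children
# rather than every wget on the system.
MAX_JOBS=5
DOWNLOADED=0

fetch() {
    # stand-in for: wget -P "$IMAGE_DIR" -nc "$1"
    sleep 0.2
}

for i in $(seq 1 12); do
    # block while this shell already has MAX_JOBS jobs running
    while [ "$(jobs -rp | wc -l)" -ge "$MAX_JOBS" ]; do
        sleep 0.1
    done
    fetch "photo-$i" &
    DOWNLOADED=$((DOWNLOADED + 1))
done
wait    # let the remaining background jobs finish
echo "$DOWNLOADED downloads started"
```

The upside over pgrep is that an unrelated wget running elsewhere can't stall the script.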

Reply
#27
Can’t you take an existing Wordpress plugin and adapt it to suit the needs rather than creating a whole new plugin?
Reply
#28
(02 Mar 2019, 02:13 )Tinker D Wrote: Can’t you take an existing Wordpress Plugin and adapt it
That's what I usually do - tinkering 😁

The problem is - there are no similar plugins around. The one I use for the RSS feeds was deleted by the author from everywhere and is, obviously, no longer supported. It's quite complex and buggy, but it works so far. The most recent version has some bugs fixed, but the caching was removed, which catastrophically degraded the performance, so I use one of the older versions, which, actually, needs to be rewritten.

The bottom line is - currently I tweak only what can be easily tweaked without digging into the details, and what will bring the maximum benefit. (BTW, I'm not even a programmer 😁 )
Reply
#29
There is no access to the RSS feed in either APIv1 or APIv2, but what might work is the following:

Program 1:
o- Grab the last x posts in the XML format
o- Convert the XML into the RSS XML
o- Store the results in memcache, indexed by the site name

Program 2:
o- Serve HTTP requests from localhost
o- Return memcache entry according to the site name in the query

Then configure the RSS feed grabber (the current WP plugin) to use the Program 2 URL

Problem #1: figure out how to convert the Tumblr XML response to an RSS feed.
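One possible shape of that conversion, sketched in shell over an invented sample of the v1 XML (the blog URL, post ids and image URLs are made up). A real converter would want a proper XML tool (xmlstarlet, xsltproc) rather than sed, and a full RSS item also needs title/pubDate fields:

```shell
#!/bin/bash
# Invented sample of a v1 API photo-post response:
XML='<posts><post id="101" url="https://example.tumblr.com/post/101"><photo-url max-width="1280">https://example.com/a.jpg</photo-url></post><post id="102" url="https://example.tumblr.com/post/102"><photo-url max-width="1280">https://example.com/b.jpg</photo-url></post></posts>'

RSS='<?xml version="1.0"?><rss version="2.0"><channel><title>tumblr-to-rss sketch</title>'

# one post per line, then pull the post URL and the image URL out of each
IFS='
'
for POST in $(sed -e "s/<post /\n<post /g" <<<"$XML" | grep "^<post "); do
    LINK=$(sed -e 's/.*url="\([^"]*\)".*/\1/' <<<"$POST")
    IMG=$(sed -e "s/.*<photo-url[^>]*>\([^<]*\)<.*/\1/" <<<"$POST")
    RSS="$RSS<item><link>$LINK</link><enclosure url=\"$IMG\"/></item>"
done
unset IFS

RSS="$RSS</channel></rss>"
echo "$RSS"
```

Program 2 would then just hand this string out of memcache to whoever asks for the site.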
Reply
#30
(01 Mar 2019, 17:54 )Like Ra Wrote: Good one! Thanks! That's the way to rip entire tumblr blogs! (Unless there's an API limitation).

But for the automatic update I need to get access to the RSS. Or ... create a new wordpress plugin, which would require much more time 😁

Need to check the API...

It will only rip everything new. That's what the STATFILE is for: it stores the id of the latest downloaded post, and the script stops there on the next run. For the first run, it's limited by MAX_POSTS.
Reply




Contributors: brothermarcus (1) , essanym (2) , HeatherHather (1) , krinlyc (1) , Like Ra (18) , madjack (5) , occorics (8) , princesitanatty (1) , Samantha Velvet (1) , therealjesse92 (1) , Tinker D (4) , vanessa_fetish (1)