How to use a buffer from <CreateSearchUrl> in <GetDetails>
#1
Code:
.
.
.
<CreateSearchUrl clearbuffers="no" dest="3">

        <RegExp input="$$1" output="\2" dest="14">
            <expression clear="yes" noclean="2" />
        </RegExp>

</CreateSearchUrl>
.
.
.
<GetDetails clearbuffers="no" dest="3">

           <RegExp input="$$14" output="&lt;tag&gt;\1&lt;/tag&gt;" dest="5+">
                <expression clear="yes" noclean="1">(.*)(480p|720p|1080p)</expression>
            </RegExp>

</GetDetails>
.
.
.

Why does this not work? Buffer $$14 is empty.
Reply
#2
because you fill it with an empty string; mind your output.
Reply
#3
When I use $$14 in the <CreateSearchUrl> section, it is not empty.
Reply
#4
Do you also have clearbuffers="no" on GetSearchResults?
Reply
#5
Yes
Reply
#6
i don't believe you. you have an empty expression, which is equivalent to select-all. then you output a nonexistent second selection.
Reply
#7
The second selection does not matter here; try this.

Add this in the CreateSearchUrl section:
Code:
<RegExp input="$$1" output="\1" dest="14">
        <expression noclean="1" />
</RegExp>
This can go into any scraper, e.g. the Universal Movie Scraper.

And add this in the GetDetails section:
Code:
<RegExp input="$$14" output="&lt;tag&gt;\1&lt;/tag&gt;" dest="5+">
        <expression noclean="1" />
</RegExp>

Fill in the missing clearbuffers="no" attributes.

Now, does this work on your XBMC?

It does not work on my XBMC 12.2, and it does not work on my OpenELEC 3.1.7 either. Buffer $$14 is empty.

What am I doing wrong?
Reply
#8
Looks like you're right. The buffer carries into GetSearchResults, but not into GetDetails.

What you could do is use the id tag in the search results to pass the contents of the buffer forward (I'm currently testing something similar for the AniDB mod scraper...)

To take the Universal scraper as an example, basically you'd replace <id>\1</id> in the output for each <entity> with something like <id>\1|$$14</id> (obviously the "\1" might be something different depending on the regexp).
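For illustration, the change might look like this (the entity layout and capture groups are placeholders, not Universal's actual output):
Code:
```xml
<!-- before: the id carries only the scraped value -->
<entity><title>\1</title><id>\2</id></entity>

<!-- after: append the saved buffer behind a separator -->
<entity><title>\1</title><id>\2|$$14</id></entity>
```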

Then in GetDetails, you have access to the <id> value in buffer $$2, so you can straight away separate it back out in case GetDetails needs to use the original value anywhere (I don't think Universal actually does, but it does fill $$2 fairly early, so you'll still want to grab what you want first):
Code:
<RegExp input="$$2" output="\1" dest="14">
    <expression>\|(.*)</expression>
</RegExp>
<RegExp input="$$2" output="\1" dest="2">
    <expression>(.*)\|</expression>
</RegExp>
Reply
#9
i am sure your issue is real, but you can't post some random code that will not behave and expect us to understand it's just some random broken code. give the exact code that breaks.

you are correct that information does not carry over from createsearchurl (or rather, from getsearchresults) and further on. there is no guarantee that this part of the scraper is called at all - in particular, URL NFOs bypass this step completely. thus, any scraper relying on this would break -> i disabled passing info from one part of the process (search and parse list) to the other (scrape a particular movie/show/episode). the only way to pass such info is to embed it in the results, e.g. each result can do

Code:
<url>foo</url><url>bar</url>
that is, every result can depend on multiple urls (we cache, so don't worry about refetching a file several times - the server won't see it). the contents of these are then available as $$1, $$2 in the getdetails calls. any number of urls per result is fine (well, there's a limit of 20, since we run out of scraper buffers..)
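As a sketch of that mechanism (the URLs are placeholders): each result simply lists several <url> elements, and GetDetails then reads the fetched contents from its numbered buffers:
Code:
```xml
<!-- each search result may list several URLs; their fetched
     contents arrive in $$1, $$2, ... inside GetDetails -->
<entity>
    <title>Some Movie</title>
    <url>http://example.com/movie/123</url>
    <url>http://example.com/movie/123/cast</url>
</entity>
```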
Reply
#10
All I need is the original contents of buffer $$1 from the CreateSearchUrl section, available in any buffer in the GetDetails section.
Reply
#11
then you need to push that url as one of the urls for each result. you can transfer data between createsearchurl and getsearchresults, but no further.
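A minimal sketch of that approach, assuming the value was saved to $$14 in CreateSearchUrl (the input buffer, dest buffer, URLs, and expression are illustrative, not from a real scraper):
Code:
```xml
<!-- GetSearchResults: emit the saved URL as a second <url> per result -->
<RegExp input="$$1" dest="8"
        output="&lt;entity&gt;&lt;title&gt;\2&lt;/title&gt;&lt;url&gt;http://example.com/movie/\1&lt;/url&gt;&lt;url&gt;$$14&lt;/url&gt;&lt;/entity&gt;">
    <expression repeat="yes">&lt;a href="/movie/([0-9]+)"&gt;([^&lt;]+)&lt;/a&gt;</expression>
</RegExp>
<!-- in GetDetails, the contents fetched from the second URL land in $$2 -->
```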
Reply
