Movieplayer.it (Italian scraper)
#1
Hi all, I'm writing this Italian scraper;
it's 70% finished, but now I'm running into some problems with custom functions.

Can anyone help me? We have only 2 simple custom functions, for cast and writer.
Is there any way to test custom functions in Nicezia's XML scraper editor?
Please explain clearly what I did wrong in the custom functions, so I can finish the scraper as soon as possible and create a ticket on SVN!

PS: The regular expressions match the text perfectly (tested with Nicezia's scraper editor)

CODE on pastebin
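For reference, this is the general shape I'm working from, a chained custom function in the XBMC XML scraper format; the function name, URL, and expressions below are simplified placeholders, not my actual code. GetDetails emits a <url function="..."> element, XBMC fetches that URL, and the named function then receives the fetched page in $$1:

Code:
<!-- Sketch only: GetMPCast, the URL and the expressions are placeholders -->
<GetDetails dest="3">
  <RegExp input="$$1" output="&lt;details&gt;&lt;title&gt;\1&lt;/title&gt;&lt;url function=&quot;GetMPCast&quot;&gt;http://www.movieplayer.it/film/cast/&lt;/url&gt;&lt;/details&gt;" dest="3">
    <expression>&lt;h1&gt;([^&lt;]*)&lt;/h1&gt;</expression>
  </RegExp>
</GetDetails>

<GetMPCast dest="5">
  <RegExp input="$$1" output="&lt;details&gt;&lt;actor&gt;&lt;name&gt;\1&lt;/name&gt;&lt;/actor&gt;&lt;/details&gt;" dest="5">
    <expression repeat="yes">class="actor"&gt;([^&lt;]*)&lt;</expression>
  </RegExp>
</GetMPCast>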


Bye all, and thanks in advance
Reply
#2
Please help!
Reply
#3
If you really need help, pastebin or trac what you got so far. Do NOT use rapidshare or any other online hoster for plain text files ..
Always read the online manual (wiki), FAQ (wiki) and search the forum before posting.
Do not PM or e-mail Team-Kodi members directly asking for support. Read/follow the forum rules (wiki).
Please read the pages on troubleshooting (wiki) and bug reporting (wiki) before reporting issues.
Reply
#4
vdrfan Wrote:If you really need help, pastebin or trac what you got so far. Do NOT use rapidshare or any other online hoster for plain text files ..
Hey, sorry, my bad!

pastebin

Thanks
Reply
#5
Only did a quick test, but those two functions are working for me. That said, I noticed some nasty HTML output in the debug log.

What exactly fails for you using the scraper in XBMC?
Reply
#6
OK, it's working now;
is there any way to cache an external page without calling a custom function?
And for fanart, is the syntax <fanart>url.jpg</fanart> correct?

thanks in advance
Reply
#7
wherever you return a url you can cache it. not sure i understand your question though.

no, that is not the syntax;

Code:
<fanart>
  <thumb>..</thumb>
  <thumb>..</thumb>
</fanart>
Reply
#8
spiff Wrote:wherever you return a url you can cache it. not sure i understand your question though.

no, that is not the syntax;

Code:
<fanart>
  <thumb>..</thumb>
  <thumb>..</thumb>
</fanart>

OK on the fanart syntax.

By caching, I mean: how can I cache the source code of a page in the GetDetails section without calling a custom function?

That is, feeding the source code of an external page (one not returned by GetSearchResults) as the input to a regex in the GetDetails section.

Bye
Reply
#9
aha.

you can nest several urls in getsearchresults

<url>url1</url><url>url2</url>

will fetch the two urls and stick them in $$1 and $$2 on the call to getdetails.
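Spelled out, that looks roughly like this; the URLs and the expression are placeholders for illustration. GetSearchResults emits an entity whose <url> elements are all fetched before GetDetails runs, landing in $$1, $$2, and so on:

Code:
<!-- GetSearchResults output: both urls get fetched before GetDetails runs -->
<entity>
  <title>Some Movie</title>
  <url>http://www.movieplayer.it/film/some-movie/</url>
  <url>http://www.movieplayer.it/film/some-movie/images/</url>
</entity>

<!-- in GetDetails, $$1 holds the first page and $$2 the second -->
<GetDetails dest="3">
  <RegExp input="$$2" output="&lt;details&gt;&lt;fanart&gt;&lt;thumb&gt;\1&lt;/thumb&gt;&lt;/fanart&gt;&lt;/details&gt;" dest="3">
    <expression>src="([^"]*\.jpg)"</expression>
  </RegExp>
</GetDetails>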
Reply
