2013-03-13, 13:06
Hi Boss,
Since upgrading to Frodo - and therefore downloading a fresh copy of your script - an error I pointed out in post 137 is back. I needed to add 'www' to the searchURL at line 580 of your code:
Code:
searchURL = 'http://www.weatherzone.com.au/search/'
This might be a larger problem for XBMC: when using a proxy (which I am), the code behind calls such as:
Code:
urllib2.urlopen(req)
does not resubmit the query to redirected URLs when the initial method is POST. At least, that's my assumption.
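A quick way to check that assumption outside XBMC is to ask urllib2's default redirect handler what follow-up request it would build when a POST to the bare domain gets redirected to the www host (a sketch only; the URLs and form data are just the ones from this thread, and the Python 3 fallback import is there so it runs anywhere):

```python
# Sketch: call urllib2's default HTTPRedirectHandler directly and inspect
# the request it would send after a 302 redirect of a POST.
try:
    import urllib2  # Python 2, as shipped with XBMC Frodo
except ImportError:
    import urllib.request as urllib2  # same classes on Python 3

# The original request: data is set, so this is a POST.
req = urllib2.Request('http://weatherzone.com.au/search/',
                      data=b'q=3122&t=3')

handler = urllib2.HTTPRedirectHandler()
follow_up = handler.redirect_request(
    req, None, 302, 'Found', {},
    'http://www.weatherzone.com.au/search/')

print(follow_up.get_method())  # prints: GET  -- the POST was downgraded
print(follow_up.data)          # prints: None -- the form body is gone
```

So the stdlib handler drops the body on redirect, which would match the proxy symptom above.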
I reconstructed your HTTP POST with cURL. Without a proxy it works:
Code:
[root@dev-01 ~]# curl -s -q -0 -k -d q=3122 -d t=3 --header "Host: www.weatherzone.com.au" --header "User-Agent: Mozilla/4.0 (compatible; MSIE 5.5; Windows NT)" http://weatherzone.com.au/search/ | grep Hawthorn
<li><a href="/vic/melbourne/hawthorn">Hawthorn, VIC 3122</a></li>
<li><a href="/vic/melbourne/hawthorn-north">Hawthorn North, VIC 3122</a></li>
<li><a href="/vic/melbourne/hawthorn-west">Hawthorn West, VIC 3122</a></li>
With a proxy, and WITH 'www' prepended to the searchURL it works:
Code:
[root@dev-01 ~]# curl --location --proxy http://proxy.thismonkey.com:3128 -s -q -0 -k -d q=3122 -d t=3 --header "Host: www.weatherzone.com.au" --header "User-Agent: Mozilla/4.0 (compatible; MSIE 5.5; Windows NT)" http://www.weatherzone.com.au/search/ | grep Hawthorn
<li><a href="/vic/melbourne/hawthorn">Hawthorn, VIC 3122</a></li>
<li><a href="/vic/melbourne/hawthorn-north">Hawthorn North, VIC 3122</a></li>
<li><a href="/vic/melbourne/hawthorn-west">Hawthorn West, VIC 3122</a></li>
However, with the original URL and with a proxy, it fails:
Code:
[root@dev-01 ~]# curl --location --proxy http://proxy.thismonkey.com:3128 -s -q -0 -k -d q=3122 -d t=3 --header "Host: www.weatherzone.com.au" --header "User-Agent: Mozilla/4.0 (compatible; MSIE 5.5; Windows NT)" http://weatherzone.com.au/search/ | grep Hawthorn
<no output>
The man page for cURL explains it thusly:
Quote:...When curl follows a redirect and the request is not a plain GET
(for example POST or PUT), it will do the following request with
a GET if the HTTP response was 301, 302, or 303...
There's the issue, at least with cURL. The original POST becomes a GET. I'd bet that the same code is in use behind the scenes in XBMC. It'd be great if an XBMC developer could take a look.
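If it does turn out to be the same downgrade inside XBMC's Python, one possible workaround (a sketch only - the class name is mine, not from the addon) is a custom redirect handler that re-sends the POST body on 301/302/303 instead of letting urllib2 fall back to GET:

```python
# Sketch: a redirect handler that preserves the POST across redirects.
try:
    import urllib2  # Python 2, as shipped with XBMC Frodo
except ImportError:
    import urllib.request as urllib2  # same classes on Python 3

class PostPreservingRedirectHandler(urllib2.HTTPRedirectHandler):
    """Re-issue the original POST at the redirect target."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        if code in (301, 302, 303) and req.get_method() == 'POST':
            # Carry the original body and headers to the new URL
            # instead of converting the request to a GET.
            return urllib2.Request(newurl,
                                   data=req.data,
                                   headers=req.headers,
                                   origin_req_host=req.origin_req_host,
                                   unverifiable=True)
        # Anything else: fall back to the stock behaviour.
        return urllib2.HTTPRedirectHandler.redirect_request(
            self, req, fp, code, msg, headers, newurl)

opener = urllib2.build_opener(PostPreservingRedirectHandler)
# response = opener.open(urllib2.Request(searchURL, data=postData))
```

Note that re-POSTing to a 302 target is technically against the letter of the HTTP spec (which is why curl and urllib2 downgrade to GET), so prepending 'www' to the searchURL is still the simpler fix.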
Anyway, I guess I'm posting this for two reasons:
1) Is there any reason not to prefix the searchURL with 'www' since you're being redirected there anyway?
2) In case anyone else is having the same issue.
Thanks,
Scott