Kodi Community Forum

Full Version: [RELEASE] OneDDL.com Plugin
Pages: 1 2 3
hd-area.org would be awesome :)
I'd also love to make/have a plugin for hd-area.org, but the problem is that they use share-links.biz to protect their links. JDownloader can't add the links automatically, and entering the captcha within XBMC isn't possible yet.

A new XBMC plugin would only work with links that JDownloader can add directly, without human interaction.
OK, so what about ddl-warez.in?
I don't know this site. I'll have a look into it next week.
This is an excellent plugin for an equally excellent site! New search feature works great!

In addition to the 'Prefer HD Links' option, could we also have a 'Disable HD Links' option for us bandwidth-limited users? I know the download names make it kinda obvious, but not everyone using my HTPC can tell the difference :)
Likewise, on occasion the selected preferred hoster causes posts with both SD and HD links to send the HD link to JDownloader, even with 'Prefer HD Links' off...

Thanks for another good reason to use XBMC!
Great plugin, thank you.

Any chance you could add Megaupload to the list at some point?
Hi kreeturez, Hi msimmo!

Thanks for your input, I'll add the requested features in the next release.
pgoeri Wrote:Hi kreeturez, Hi msimmo!

Thanks for your input, I'll add the requested features in the next release.

Awesome; thanks pgoeri!

msimmo Wrote:Any chance you could add Megaupload to the list at some point?

This would be great, though for recent posts the Megaupload links tend to be concealed within MultiUpload links on the site. That's fine, since JDownloader plays nice with MultiUpload links (so these links could be used); the only issue is that all possible mirrors in the MultiUpload link get passed to it.
The solution is mentioned on the JDownloader forum in this post: simply disable the file hosters you don't want to be used within JD itself, and those mirrors will be ignored when downloading from MultiUpload.
So this would be a great addition to the plugin!!
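As an alternative to disabling hosters inside JDownloader, the mirror filtering could also happen in the plugin itself before anything is sent. A minimal sketch of that idea (hypothetical names and hoster list, not the plugin's actual code):

```python
# Hypothetical sketch: pick one preferred mirror out of a MultiUpload
# mirror set before sending anything to JDownloader, instead of
# disabling the unwanted hosters inside JDownloader itself.
PREFERRED_HOSTERS = ['megaupload.com', 'wupload.com']

def pick_mirror(mirrors):
    # Return the first mirror from a preferred hoster, falling back to
    # the first mirror overall; None if the list is empty.
    for hoster in PREFERRED_HOSTERS:
        for url in mirrors:
            if hoster in url:
                return url
    return mirrors[0] if mirrors else None
```

With this, only a single chosen mirror would be handed over, so JDownloader never sees the unwanted ones.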
Great plugin, works perfectly.

Thanks
kreeturez Wrote:Likewise, on occasion the selected preferred hoster causes posts with both SD and HD links to send the HD link to JDownloader, even with 'Prefer HD Links' off...

I rely on the fact that the posts always have the same format, so it can happen that the link scraping doesn't work correctly for some posts. It would be great if you could send me such posts (just the link) via PM.
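The fixed-format assumption described above can be illustrated in a few lines. This is a sketch, assuming a helper that returns the first regex capture group (the real OneDDLCore code may differ):

```python
import re

# Minimal sketch of fixed-format scraping; execute_re is an assumed
# helper, not the plugin's actual implementation.
def execute_re(pattern, text):
    match = re.search(pattern, text, re.DOTALL)
    return match.group(1) if match else None

# A post in the expected layout scrapes fine:
post = 'id="more-1">http://example.com/file.rar class="postmeta"'
area = execute_re('id="more-(.+?)class="postmeta"', post)
# ...but if a post deviates from the layout (e.g. no "postmeta" marker),
# execute_re returns None and the link section can't be isolated.
```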
Code:
[b]Version 0.2.0[/b]
    * Refactoring (now using the DDLScraper library)
    * New sub-categories: Mac Applications & Windows Applications
    * New supported filehosters: megaupload.com, multiupload.com, wupload.com, oron.com
    * New setting: 'Alternative Filehoster 3' (due to the large number of filehosters)

Planned features:
* Predefined (Only HD, No HD, ...) and custom filters
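A rough sketch of how the planned predefined filters might work (names and structure are my assumption, not the plugin's actual code): each filter is a predicate matched against release names, and custom filters could be handled the same way.

```python
# Hypothetical sketch of the planned filters: predefined filters are
# predicates matched against release names.
PREDEFINED_FILTERS = {
    'Only HD': lambda name: '720p' in name or '1080p' in name,
    'No HD':   lambda name: '720p' not in name and '1080p' not in name,
}

def apply_filter(names, filter_name):
    # Keep only the release names that pass the selected predicate;
    # unknown filter names keep everything.
    predicate = PREDEFINED_FILTERS.get(filter_name, lambda name: True)
    return [n for n in names if predicate(n)]
```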
Hi, I just tried your plugin: I can start it and see the categories, but I don't get any results. Every category is empty. Do you have any idea why that is?

I'm on a DSPlayer-enabled XBMC pre-Eden build (Git: 20111030), with the Aeon MQ3 skin.
OneDDL.com has changed their design, and because of this the plugin is currently broken. I'll fix it ASAP.

quick fix:

file: plugin.download.oneddl/resources/lib/OneDDLCore.py

replace
Code:
def _trimPosts(self, full_website):
    # get only the area with the file links (ignore samples and comments)
    website = self._executeRE('id="more-(.+?)class="postmeta"', full_website)
    if website is None:
        # couldn't extract the link section; use the whole page for scraping
        website = full_website
    return website

with

Code:
def _trimPosts(self, full_website):
    # get only the area with the file links (ignore samples and comments)
    website = self._executeRE('id="more-(.+?)class="postmeta"', full_website)
    if website is None:
        # fall back to the redesigned page's end-of-content marker
        website = self._executeRE('id="more-(.+?)<!-- .entry-content -->', full_website)
    if website is None:
        # couldn't extract the link section; use the whole page for scraping
        website = full_website
    return website
Just tried it; unfortunately it doesn't work :/
It's the same: I see all the categories, but they're all empty...

btw thanks for the fast reply! :D
Code:
[b]Version 0.2.1[/b]
    * Fixed problems with the new website design
    * Ignore duplicate filehoster links
    * New sub categories: HDRips & FLAC

@voodoofox sorry, I posted the wrong fix.