2014-12-13, 12:18
2014-12-13, 18:28
OK guys, this will be a full revolution.
http://blogs.msdn.com/b/dotnet/archive/2...-apis.aspx
We can remove all BackgroundWorkers and make the code vastly simpler, but it will still require some work.
Some good reading:
http://msdn.microsoft.com/en-us/library/...eteExample
Get ready for the async/await version of Ember.
2014-12-13, 20:27
Guys,
Who generated the tens (hundreds?!) of handlers like Private Sub mnuMovieSetMarkAskEFanarts_Click?
That is not good practice... it is a very bad one.
All menus of the same "family" should share a single handler procedure and pass an ID to switch between them; the ID should be stored in the Tag property of the control.
The code is orders of magnitude more efficient and easier to maintain.
Boooo!
The exact number is 245 menu procedures vs. 45 existing! Ouch!
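The pattern being proposed can be sketched like this; the menu item names and action IDs below are made up for illustration, not Ember's real controls:

```vbnet
' Sketch of the shared-handler pattern. Menu item names and action IDs
' are hypothetical examples, not Ember's actual controls.
Private Sub SetupMenus()
    ' Store an action ID in each item's Tag at initialization time.
    mnuMarkFanarts.Tag = "MarkFanarts"
    mnuMarkPosters.Tag = "MarkPosters"
End Sub

' One handler for the whole menu "family", wired with a shared Handles clause.
Private Sub mnuMovieSetMark_Click(sender As Object, e As EventArgs) _
    Handles mnuMarkFanarts.Click, mnuMarkPosters.Click
    Dim item = DirectCast(sender, ToolStripMenuItem)
    Select Case CStr(item.Tag)
        Case "MarkFanarts"
            ' ... mark extrafanarts ...
        Case "MarkPosters"
            ' ... mark posters ...
    End Select
End Sub
```

One Handles clause covers the whole family, and the Tag value decides which action runs, so adding a menu item means one new line in the Select Case instead of a whole new procedure.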
2014-12-14, 00:43
Work is continuing: removing all BackgroundWorkers, as explained in the MSDN best practices.
In this round I will remove all BackgroundWorkers, then I will add global cancellation as well as progress management, as explained in the article I found and posted.
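As a rough sketch (control and method names here are illustrative, not Ember's actual members), replacing a BackgroundWorker's DoWork/ProgressChanged/RunWorkerCompleted trio with Async/Await looks like this:

```vbnet
' Hedged sketch: one Async event handler replaces a whole BackgroundWorker.
' btnScan, prgStatus, lblStatus and ScanLibrary are made-up names.
Private Async Sub btnScan_Click(sender As Object, e As EventArgs) Handles btnScan.Click
    Dim progress = New Progress(Of Integer)(Sub(pct) prgStatus.Value = pct)
    Try
        ' The heavy work runs off the UI thread; Await resumes on the UI
        ' thread, so no Invoke/BeginInvoke is needed afterwards.
        Dim count = Await Task.Run(Function() ScanLibrary(progress))
        lblStatus.Text = String.Format("Scanned {0} items", count)
    Catch ex As Exception
        lblStatus.Text = ex.Message
    End Try
End Sub

Private Function ScanLibrary(progress As IProgress(Of Integer)) As Integer
    For i = 1 To 100
        ' ... do one unit of work ...
        progress.Report(i) ' marshalled to the UI thread automatically
    Next
    Return 100
End Function
```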
2014-12-14, 05:14
Have you synced our latest commits with your fork? :-)
2014-12-14, 12:14
I cannot sync the two forks...
Once mine is working well enough I will push it to you.
2014-12-14, 12:54
Advancing quite quickly on the bw removal; they are all in the main form.
Once that is done I think I can push the changes so you can harmonize the sources...
1) After that I will have to add proper global cancellation for awaited procedures AND visual feedback for the ones that need it
2) After that I will rework the file operations to be async and awaitable
3) After that I will update ALL the scrapers to use the correct awaitable libraries (as written in previous posts) and get rid of the C# code
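Step 1) could be sketched roughly like this, assuming a single shared CancellationTokenSource that every awaited operation observes; all names below are illustrative:

```vbnet
' Sketch of the "global cancel" idea: one shared CancellationTokenSource,
' cancelled from a button, observed by every awaited operation.
' btnStart, btnCancel, lblStatus and DoLongWorkAsync are made-up names.
Private cts As CancellationTokenSource

Private Async Sub btnStart_Click(sender As Object, e As EventArgs) Handles btnStart.Click
    cts = New CancellationTokenSource()
    Try
        Await DoLongWorkAsync(cts.Token)
        lblStatus.Text = "Done"
    Catch ex As OperationCanceledException
        lblStatus.Text = "Cancelled"
    End Try
End Sub

Private Sub btnCancel_Click(sender As Object, e As EventArgs) Handles btnCancel.Click
    If cts IsNot Nothing Then cts.Cancel()
End Sub

Private Async Function DoLongWorkAsync(token As CancellationToken) As Task
    For i = 1 To 100
        token.ThrowIfCancellationRequested()
        Await Task.Delay(50, token) ' stand-in for one awaited unit of work
    Next
End Function
```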
2014-12-14, 13:10
I think we had a lot of useless bw... there are no time-consuming or resource-intensive tasks...
I will even have to check the timers... we have a lot of them.
Funny: bwMovieSetInfo does not have a body (no DoWork handler), so it is useless...
2014-12-14, 13:34
I have left A LOT of comments in the code to allow an easier merge, to help everyone understand more easily what happened, and because some of the code (progress update) may be useful in step 1).
After the three steps above I will remove all commented-out lines.
2014-12-14, 15:08
(2014-12-14, 13:10)m.savazzi Wrote: Funny: bwMovieSetInfo does not have a body (no DoWork handler), so it is useless...
Copy-Paste error ;-)
MovieSets don't have MetaData, so you can remove the bw and the "doMI" part in "Private Sub LoadMovieSetInfo".
2014-12-16, 09:29
(2014-12-13, 20:27)m.savazzi Wrote: Guys,
Who generated the tens (hundreds?!) of handlers like Private Sub mnuMovieSetMarkAskEFanarts_Click?
That is not good practice... it is a very bad one.
All menus of the same "family" should share a single handler procedure and pass an ID to switch between them; the ID should be stored in the Tag property of the control.
The code is orders of magnitude more efficient and easier to maintain.
Boooo!
The exact number is 245 menu procedures vs. 45 existing! Ouch!
(2014-12-14, 12:54)m.savazzi Wrote: Advancing quite quickly on the bw removal; they are all in the main form.
Once that is done I think I can push the changes so you can harmonize the sources...
1) After that I will have to add proper global cancellation for awaited procedures AND visual feedback for the ones that need it
2) After that I will rework the file operations to be async and awaitable
3) After that I will update ALL the scrapers to use the correct awaitable libraries (as written in previous posts) and get rid of the C# code
Please change only Await/Async-relevant things; otherwise it will be a mess to merge your changes with the master branch.
2014-12-18, 22:40
(2014-12-16, 09:29)DanCooper Wrote: Please change only Await/Async-relevant things; otherwise it will be a mess to merge your changes with the master branch.
Yes, do not worry!
It will already be quite challenging as it is. I'm commenting code out, not deleting it, and not optimizing it.
That will be point 4).
I should be almost there...
M
2014-12-20, 17:48
Pushed the first release to Dan's repository!
The fun will begin!
Look at this: http://msdn.microsoft.com/en-us/library/hh696703.aspx
I will add it to the images and other heavy-download sections.
http://blogs.msdn.com/b/pfxteam/archive/...93335.aspx
http://blogs.msdn.com/b/pfxteam/archive/...35962.aspx
http://blogs.msdn.com/b/pfxteam/archive/...77034.aspx
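For those heavy-download sections, an awaitable download along the lines of the TAP articles might look like the sketch below; the method and its parameters are hypothetical, not Ember's existing code (HttpClient requires .NET 4.5):

```vbnet
' Hedged sketch of an awaitable, cancellable image download.
' Requires: Imports System.IO, System.Net.Http, System.Threading
Private Shared ReadOnly client As New HttpClient()

Private Async Function DownloadImageAsync(url As String,
                                          destPath As String,
                                          token As CancellationToken) As Task
    ' ResponseHeadersRead starts streaming as soon as headers arrive,
    ' instead of buffering the whole body in memory first.
    Using response = Await client.GetAsync(url, HttpCompletionOption.ResponseHeadersRead, token)
        response.EnsureSuccessStatusCode()
        Using source = Await response.Content.ReadAsStreamAsync(),
              dest = File.Create(destPath)
            Await source.CopyToAsync(dest, 81920, token) ' async copy, cancellable
        End Using
    End Using
End Function
```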
2014-12-29, 22:37
Hi Dan,
I was trying out the 1.4 branch and found that scraping seemed slow (well, no worse than 1.3, but I felt that 1.3 was also slow).
(And no, I don't have the goear module enabled.)
I've done some profiling work to locate where and why, and I've made some patches:
https://github.com/cg110/Ember-MM-Newscr...its/master
The main changes are:
Cache the tmdb scraper; it's expensive to create (about 0.5 s, as it talks to the tmdb servers). Needs more work/tidying up to handle other scraper paths, e.g. moviesets.
Tweak the alpha-2 ISO code lookups to use a dictionary (I suspect the other lookups could benefit too, but the ISO-2 lookup was hit the most during scraping).
I also made some minor optimizations to the HTTP code; two of the changes were just tweaking buffers and copying between streams.
However, IsValidURL was changed to ask the server only for the HEAD, not the body. The old behavior made verification of the YouTube video URLs slow, presumably because the server starts to locate the video file and stream it; switching to HEAD seems to have sped things up, and is probably the right thing to do.
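A minimal sketch of the HEAD-only check described above (the method body is illustrative, not the actual patch):

```vbnet
' Sketch: validate a URL by requesting only the response headers.
' Asking for HEAD avoids pulling the body just to see if the URL resolves.
' Requires: Imports System.Net
Private Function IsValidURL(url As String) As Boolean
    Try
        Dim request = DirectCast(WebRequest.Create(url), HttpWebRequest)
        request.Method = "HEAD"   ' headers only, no body
        request.Timeout = 5000
        Using response = DirectCast(request.GetResponse(), HttpWebResponse)
            Return response.StatusCode = HttpStatusCode.OK
        End Using
    Catch ex As WebException
        ' GetResponse throws on 4xx/5xx and on network errors.
        Return False
    End Try
End Function
```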
Thoughts?
Do you want pull requests for any of them?
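The dictionary idea for the alpha-2 lookups can be sketched like this, with a tiny made-up subset of the data; the map is built once, after which each lookup is O(1) instead of a scan through a list:

```vbnet
' Sketch of a dictionary-based alpha-2 ISO code lookup.
' The entries here are a tiny illustrative subset, not the real table.
Private Shared ReadOnly Alpha2ToLanguage As _
    New Dictionary(Of String, String)(StringComparer.OrdinalIgnoreCase) From {
        {"en", "English"},
        {"de", "German"},
        {"it", "Italian"}
    }

Private Function LanguageFromAlpha2(code As String) As String
    Dim name As String = Nothing
    If Alpha2ToLanguage.TryGetValue(code, name) Then Return name
    Return String.Empty
End Function
```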
2014-12-31, 02:56
(2014-12-29, 22:37)cg110 Wrote: I've done some profiling work to locate where and why, and I've made some patches: https://github.com/cg110/Ember-MM-Newscr...its/master
I will check your changes, thx mate.