Android News
ADC Winners result expected on Monday (2008.05.12)
Posted 2008-05-09 (Fri) 00:44
Recommend: 0 · Views: 2513
May 12, 2008. Monday.
Your Android application is not in the top 50 ADC winners list.
Dear Android Developer Challenge losers,
I know you are drunk. I know you are hung over. I know you are feeling terrible. I know you are at your lows. I know life sucks. I know you can't get out of your bed. But wake up, will ya?
Your Android application is not in the top 50 winners list. Cool. Take a deep breath. Look outside. Anything changed? Nah... nothing changed. Your wife loves you as much as she did before. Your kids love you as much as they did before. Your friends love you just like before. You must do the same. You must do the same for your Android application. Give it the same love, will ya?
First things first: send out your congratulations to the top 50 winners. Yes, you heard me right. Do it now, and do it fast. Better? That felt good, right? Yeah, I know. It feels great. Wonderful.
Now that you are feeling better, let me tell you this:
Here are some details about the prize money which Dan Morrill posted a few days ago. Hopefully you will all get some of the fund:
ADC 1 == this $5,000,000 prize event going on now.
ADC 2 == the second $5,000,000 prize event that will begin later this year.
ADC 1 Round 1 == open participation with the deadline of 14 April, with 50 winners
ADC 1 Round 2 == participation limited to the winners of ADC 1 Round 1, with 20 “final” winners
ADC 1 Round 1 Phase 1 == reducing the original set of 1,788 submissions to 100 finalists
ADC 1 Round 1 Phase 2 == picking the 50 ADC 1 Round 1 winners from the 100 finalists
Okay, phew. :) With those definitions, here is where we are:
* We sent out the submissions to judging a few days after the submission deadline of 14 April, and judging began.
* Our 100 or so judges received the judging guidelines we provided, reviewed their assigned submissions, and reported data back to us.
* Late last week, we applied our outlier mitigation techniques, identified the top 100 results, and sent them on to the final, separate panel of 15 or so judges to score and produce the final 50 ADC 1 Round 1 award recipients.
So in other words, we are currently in ADC 1 Round 1 Phase 2 as defined above. Once data from the judges comes in, we will notify the 50 award recipients and ADC 1 Round 2 will begin.
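The "outlier mitigation" step above is not spelled out in the announcement, but the general idea of reducing the influence of a single unusually harsh or generous judge can be sketched with a trimmed mean. The function name, the score scale, and the sample data below are all hypothetical illustrations, not the actual ADC scoring method:

```python
from statistics import mean

def trimmed_mean(scores, trim=1):
    """Drop the `trim` highest and lowest scores, then average the rest.

    This is one common outlier-mitigation technique; the ADC's actual
    method was not disclosed at the time of this post.
    """
    if len(scores) <= 2 * trim:
        return mean(scores)
    s = sorted(scores)
    return mean(s[trim:len(s) - trim])

# Hypothetical judge scores for three submissions (invented 1-10 scale).
submissions = {
    "app_a": [8, 9, 7, 9, 2],   # one harsh outlier gets trimmed away
    "app_b": [6, 7, 6, 7, 6],
    "app_c": [9, 1, 9, 10, 9],  # a judge who hit a bug no longer sinks the app
}

# Rank submissions by their trimmed mean, highest first.
ranked = sorted(submissions, key=lambda k: trimmed_mean(submissions[k]),
                reverse=True)
for name in ranked:
    print(name, round(trimmed_mean(submissions[name]), 2))
```

With trimming, `app_c` ranks first despite the single score of 1, which matches the post's point that one judge's sporadic bad experience need not doom a submission.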
It has not escaped my notice even on vacation that there have been a number of discussions on server hits and so on. Obviously we don’t have access to everyone’s server logs, and we can’t monitor what the judges have actually been doing (nor would we snoop if we could, since that seems really sketchy.) We’ve tried to automate everything we possibly can about the judging process, but the one thing we can’t automate is the actual act of assigning scores, since that requires a human’s brain.
The judges were given fairly detailed guidance on how to calibrate their scores, and what to review. For instance, they are aware that they are supposed to read documentation and do their best to test all the features. In the end, though, each judge is going to test to his or her own satisfaction. I’m not sure how reliable it is to correlate judge reviews with observed server hits. Some apps might have sporadic bugs that prevent network accesses. Some judges may have decided they didn’t need to see a particular feature. And before you cry foul, know that some people who have inquired about “missing” server hits have actually done quite well. Judges are just as likely to say “this is cool, I don’t need to see any more” as they are to say “this is so uncool, I don’t need to see any more.” On the whole, our judges have been excited to participate, and I expect that they are being as conscientious as they can be.
The one thing I can tell you with certainty is that I have answered quite a few private inquiries, and in all but one case the judges responded with legitimate scores, rather than scores that say something went wrong or the review was incomplete. Our only data points are what the judges give us, because that’s the only factor we can’t automate. Since the judges are telling us that they reviewed to their satisfaction, we can only take their word for it.
We’ve tried really hard to make sure that the only thing that affects scoring is what you put in front of the judges. But the entire goal of the ADC is to leverage plain old human judgment.
- Dan
P.S. - watch for gory details on the nuts & bolts of all this in the near future.
회색
2008-05-10 05:19
The results are out. Notification emails have been sent out individually, and the top-50 list will reportedly be announced on the 12th.
껄뜩이
2008-05-10 13:14
By the way, where did you get this post from? haha
The writing is really entertaining. haha