How would you measure the quality of a Hopscotch project?

Hi All!

I have a question for you: if you wanted to measure the quality of a Hopscotch project through something numerical, what would you use? From a numbers perspective, what do you think would indicate that project X is a better project than project Y?

Examples could be: size of json file, number of blocks, number of objects, etc. I’m curious what you’d use to indicate quality.

Thank you!


I would look at the number of blocks in the code, and sometimes at the like-to-play ratio.

Without playing it, I sometimes look at the title as well as the thumbnail – my x-ray feature in the Web Explorer looks at words and colors in the thumbnail as part of the criteria as well.


I agree with E: the number of blocks of code and the thumbnail play a big part for me


Well, I personally judge it by whether it has a cool thumbnail and a lot of code, but not so much code that it is lag central


If you want to know my x-ray formula, here it is:

xray mechanism

Project title:

  • Must include at least 5 letters
  • Must not match this regular expression:
    /([a-z0-9])\1{5,}|([?!].*){3,}|([a-z]{0,12},)?[a-z]{0,12}&[a-z]{0,12}|[a-z0-9]{16,}|.{41,}|fan\s?art|\bI think\b|\bremix\b|\bimpossible\b|\bomg\b|Cros[bs]y|\bDont\sdrop\s(your)?\s(phone|📱)|\bannouncement|\bshout\s*?out\b|\brequests?\b|\bpl[zs]\b|\bplease\b|\bif.{0,10}(get).{0,10}like\b|\blike for part\b|\b(so|super)\s(easy|hard)\b|\blike\sbutton\b|\btry(\snot)\s(to)?\b|\bfidget\b|\bspinner\b|[\s|^][bcdefghjklmnpqrtuwxyz][\s$]|(read|see) (in |the )? code/

    Translation:
    • Cannot repeat the same letter/number 6 or more times in a row
    • Must have less than 3 “?” and “!” (combined)
    • Must not be untitled (shows title as “object 1, object 2 & object 3”)
    • No word (unbroken run of letters/numbers) can be 16 or more characters long
    • Must be shorter than 41 characters before any separators – those are (, :, and -.
    • Cannot have “fan art”, “remix”, “i think” (as in I think I did this wrong), “impossible”, “omg”, “announcement”, “shoutout”, “request”, “please/pls/plz”, “like for part x”, “super easy/hard”, “like button”, “try not to”, “fidget spinner”, or “read/see code” in title.
    • Cannot be a “Don’t Drop your phone” or “Crossy Road” project
    • Cannot contain single-letter words (exceptions: a, i, o, s, v)
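To make the title criteria concrete, here is a minimal JavaScript sketch of a filter in this spirit. It uses only a few representative patterns rather than the full expression, and the structure is my own, not the actual x-ray code:

```javascript
// Simplified sketch of a title filter like the one described above.
// Only a few representative patterns are included, not the full expression.
const bannedTitle = new RegExp(
  [
    "([a-z0-9])\\1{5,}",   // same character repeated 6+ times
    "([?!].*){3,}",        // 3 or more "?" / "!" combined
    "[a-z0-9]{16,}",       // an unbroken run of 16+ letters/digits
    ".{41,}",              // 41 characters or longer
    "fan\\s?art|\\bremix\\b|\\bimpossible\\b|\\bplease\\b", // sample banned words
  ].join("|"),
  "i"
);

// A title passes if it has at least 5 letters and hits none of the patterns.
function titlePasses(title) {
  const letters = (title.match(/[a-z]/gi) || []).length;
  return letters >= 5 && !bannedTitle.test(title);
}
```

For example, `titlePasses("Pixel Art Painter")` is true, while `"PLEASE like this!!!"` fails on both the banned word and the punctuation rule.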

Project Thumbnail:

  • Must contain at least 3 significant colors (each representing more than 0.7% of the thumbnail).
  • None of the 3 most abundant colors should be extremely close to hot pink (because that is what’s used inside “Draw Like a Pen”, and those projects are 99% nonsense)
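The thumbnail rules could be sketched like this, assuming the image has already been decoded into an array of [r, g, b] pixels (e.g. via a canvas). The color quantization, the bucket size, and the hot-pink distance threshold are my assumptions, not the actual x-ray values:

```javascript
// Sketch of the thumbnail check. HOT_PINK uses the CSS "hotpink" value,
// an assumption about the pen color; the threshold of 60 is also a guess.
const HOT_PINK = [255, 105, 180];

function colorDistance(a, b) {
  // Plain Euclidean distance in RGB space.
  return Math.hypot(a[0] - b[0], a[1] - b[1], a[2] - b[2]);
}

function thumbnailPasses(pixels) {
  // Bucket each pixel into 32-level bins so near-identical shades group together.
  const counts = new Map();
  for (const [r, g, b] of pixels) {
    const key = `${r >> 5},${g >> 5},${b >> 5}`;
    counts.set(key, (counts.get(key) || 0) + 1);
  }
  const total = pixels.length;
  const sorted = [...counts.entries()].sort((a, b) => b[1] - a[1]);

  // Rule 1: at least 3 colors each covering more than 0.7% of the thumbnail.
  const significant = sorted.filter(([, n]) => n / total > 0.007);
  if (significant.length < 3) return false;

  // Rule 2: none of the 3 most abundant colors may sit close to hot pink.
  for (const [key] of sorted.slice(0, 3)) {
    const rgb = key.split(",").map(v => (Number(v) << 5) + 16); // bucket center
    if (colorDistance(rgb, HOT_PINK) < 60) return false;
  }
  return true;
}
```

A mostly-pink "Draw Like a Pen" thumbnail fails rule 2 even when it has three significant colors.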

Other Conditions:

  • Project will not show up if it has 5 or more likes but fewer than 3 plays (this removes most art – only because it’s not what I am interested in picking)
  • Automatically shows if play count > 15 and thumbnail is good to go.
  • Automatically shows if likes > 50 and play count > 20.
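Taken together, the visibility rules above might combine like this. The fallback to the title and thumbnail criteria when no automatic rule fires is my assumption about how the checks fit together:

```javascript
// Sketch of the visibility decision; thumbnailOk and titleOk would come
// from the title and thumbnail checks described earlier.
function xrayShows({ likes, plays, thumbnailOk, titleOk }) {
  if (likes >= 5 && plays < 3) return false;  // filters most art posts
  if (plays > 15 && thumbnailOk) return true; // auto-show on play count
  if (likes > 50 && plays > 20) return true;  // auto-show on likes
  return titleOk && thumbnailOk;              // otherwise, full criteria (assumed)
}
```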

Changes since first x-ray release:

  • Added some words to the no-match expression for the thumbnail and the title.
  • Introduced a new type of remix bar for time spent in a draft. With x-ray turned on, these appear on projects that were either saved as a draft for less than 3 minutes or never saved as a draft at all. This is a pretty good indicator of whether a project is a remix, though it does not account for making a legitimate project in one sitting. It brings the x-rayed project percentage in Newest from approximately 60% to nearly 90%.

If you don’t know what x-ray is, go to Newest on the web explorer and tap the target button on the bottom left, then tap the eye.


Another framing of the question: how would you judge the quality of Hopscotch projects in aggregate? For example, if I wanted to show that projects from the last 6 months are better than projects during the same period 2 years ago, do you think there is a statistic that would show that? (And do you think that this is true?)
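One way to run that kind of comparison: pick a per-project metric (blocks, rules, variables, whatever), then compare a robust summary of it across the two periods. A median avoids a handful of giant projects skewing the answer; this is just a sketch of the idea, not a claim about which metric is right:

```javascript
// Compare a per-project metric across two time periods using the median,
// which is robust to a few enormous outlier projects.
function median(values) {
  const sorted = [...values].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 ? sorted[mid] : (sorted[mid - 1] + sorted[mid]) / 2;
}

function comparePeriods(metricsNow, metricsThen) {
  return { now: median(metricsNow), then: median(metricsThen) };
}
```

For instance, `comparePeriods([10, 20, 30], [1, 2, 3])` returns `{ now: 20, then: 2 }`.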


Okay, so here we go:

Estimated time to make the project (if the person judging it were to make the project themselves, how long would it take? This assumes the judge could make it)

Graphics complexity (how complicated are the graphics? Did they stay away from emojis and default characters?)

Aesthetics (just how the project looks all in all)

Complexity of code (not just the number of blocks: variables, loops, conditionals, etc.)


Complexity of math/physics (did they use math or physics? If so, how high level was it?)


A lot of these are similar and/or overlap, though.


I would use a poll from all of the Hopscotchers.

Both on the forums and the app


Ooh yes, that’s a good one. You can get a pretty good estimation using the UUID and the publish time of the project. It’s actually an attribute on each project card in the Web Explorer if you open Dev Tools on a browser.

I wouldn’t say that completely makes up for it; sizing of default assets does change things a lot.

That’s pretty hard to detect with just getting statistics from projects but is a good indicator.

As for block distribution, @The_Vast_Void + @AwesomeOnion, that could be an interesting thing to look at. If anyone is interested in finding a correlation of distributions of certain block types and quality of projects, I recommend tapping on the stats button on this editor I built.


Image for that


One problem is that, for a lot of the most complex projects, people seem to make duplicates to save along the way, which would be harder to track


Hmm, yeah, but many of the average projects in Newest seem to have an age of just a few minutes. Even a couple of weeks stands out a lot.


Yeah, hmm, what about the number of variables or set/increase variable blocks? To give more context: I am not looking to identify good projects, but rather for ways to show improvement in the community over time; I was planning to look at how featured projects have changed.


Number of rules, maybe? I know a lot of the bigger projects I make have more rules


Those are pretty good indicators, as well as the number of check if (else) blocks. Also what @Nobody said – rules and custom rules are pretty good indicators. Basically more loop/control code.
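If anyone wants to experiment with block distributions, a tally like this is the usual starting point. Note that the field names here (`abilities`, `blocks`, `type`, `children`) are placeholders I made up, not the real Hopscotch project schema:

```javascript
// Hypothetical sketch: tally block types in a project's JSON.
// Field names are placeholders, NOT the actual Hopscotch schema.
function blockHistogram(project) {
  const counts = {};
  const walk = (blocks = []) => {
    for (const block of blocks) {
      counts[block.type] = (counts[block.type] || 0) + 1;
      walk(block.children); // blocks nested inside loops / checks (assumed field)
    }
  };
  for (const ability of project.abilities || []) walk(ability.blocks);
  return counts;
}
```

Comparing the share of control blocks (rules, loops, check if else) against plain drawing and movement blocks across featured projects from different years would be one way to test the ideas in this thread.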


It could be something where we just look at outliers: i.e. the 10 best projects published each month.


I don’t really use these for big projects


Oh sorry I read the question wrong


What do you mean?

Yeah, I get that. I created my criteria based on the assumption that whoever is judging the project will be able to judge it in-depth.

That would make it harder to judge based on data, but again, I meant that you would estimate how long it would take you to create.