@oio last touch without block!


If there was no last touch block, do you think you could still find the last touch position — one that didn't come from the block itself — by pressing a sprite and using its location to detect the last touched location? I mean just using the sprite's x, y position as a proxy.


That's an odd question. Hmm... Okay.

On first glance, if we eliminate the "last touch" functionality from the get-go, it would seem that the question answers itself. Maybe I should ask... what's still on the table in this scenario?

I guess, if I understood right, I would resort to making a matrix of objects as my touch sensors. Or try rastering - even triangulation.


So... :slightly_smiling:

Now I'm curious. What are you workin'-up? Sounds like a delightfully abstract challenge. Right up your alley...


That's what I was thinking. But having multiple long rectangle sprites spinning, then stopping when pressed, and using the angles they stop at to find the last touch there.
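To picture the geometry of that: each spinning line sprite that stops under the finger defines a ray from its pivot, and two such rays pin down the touch point at their intersection. A minimal sketch in plain Python (not Hopscotch blocks; the function name and coordinates are mine, just for illustration):

```python
import math

def line_intersection(p1, theta1, p2, theta2):
    """Intersect two lines, each given by an anchor point and an angle in radians.
    Returns the (x, y) intersection, or None if the lines are parallel."""
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    # Solve p1 + t*d1 = p2 + s*d2 for t via the 2x2 determinant.
    det = d1[0] * (-d2[1]) - d1[1] * (-d2[0])
    if abs(det) < 1e-9:
        return None  # parallel: no unique intersection
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t = (rx * (-d2[1]) - ry * (-d2[0])) / det
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Two spinners anchored at opposite corners, frozen at 45° and 135°,
# cross at the touch point:
touch = line_intersection((0, 0), math.radians(45), (10, 0), math.radians(135))
```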


Okay... You gonna put something together and post it? Or... are you interested in seeing my feeble attempt, if I make one? :smile:


I plan to try and make one.
If my code works, I will publish a link.
Forgive the grammar from a terrible android phone


Terrible Android phone, huh? Well, the poor thing is probably still recovering from having you perform an IR-filterectomy on it! :laughing:


I just figured out a simple way to do this.
Have a horizontal line text moving from the screen bottom to the screen top.
Have a vertical line text moving from the left side of the screen to the right.
When pressing the screen, both line texts will eventually pass under where the screen's pressed.
They can stop moving when pressed.
The x position of one and the y position of the other can be used as the last/current touch position.
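A quick way to sanity-check that logic off-device is to simulate the two sweeps (ordinary Python, not Hopscotch; the screen size and step size are made-up numbers):

```python
SCREEN_W, SCREEN_H = 1024, 768

def sweep_for_touch(touch_x, touch_y, step=1):
    """Simulate a horizontal line sweeping up and a vertical line sweeping
    right. Each stops the first frame it reaches the finger; the stopped
    positions together give the touch coordinates."""
    h_line_y = 0  # horizontal line starts at the bottom edge
    v_line_x = 0  # vertical line starts at the left edge
    h_stopped = v_stopped = False
    for _ in range(max(SCREEN_W, SCREEN_H) // step + 1):
        if not h_stopped:
            # The horizontal line spans the full width, so it counts as
            # "pressed" as soon as it reaches the finger's y position.
            if h_line_y >= touch_y:
                h_stopped = True
            else:
                h_line_y += step
        if not v_stopped:
            if v_line_x >= touch_x:
                v_stopped = True
            else:
                v_line_x += step
        if h_stopped and v_stopped:
            break
    return v_line_x, h_line_y
```

Note the cost this sketch makes visible: in the worst case the lines take a full screen-width of frames to reach the finger, which is the temporal lag discussed below.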

I'd like to search for projects made before the last touch feature was added to Hopscotch.
Did anyone ever find or try to find a solution to make a last touch code for their project?
@Liza @Ian
Like a draw pad that left a trail from the last touch position when swiping the screen


Sounds like it is going to be a great game!


Not really a game. This is more of a code experiment


@Stradyvarious, your strategy sounds very much in line with what your original question made me think. Either you have a static array of objects and, therefore, a finite spatial resolution; or you have a smaller number of moving objects for the purpose of rastering-to-touch, at the expense of temporal resolution (i.e. it would be laggy).
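To make the "finite spatial resolution" of the static-array option concrete: a touch anywhere inside a sensor cell reads as that cell's centre, so the error is bounded by half the cell size, and halving the cell size quadruples the object count. A toy illustration (Python; the cell size is an arbitrary example):

```python
def grid_sensor_reading(touch_x, touch_y, cell=64):
    """Snap a touch to the centre of the sensor cell it lands in.
    Error is at most cell/2 in each axis; finer grid = more sensor objects."""
    col = int(touch_x // cell)
    row = int(touch_y // cell)
    return (col * cell + cell / 2, row * cell + cell / 2)
```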

I have a feeling that the array solution is the most compatible with Hopscotch. That's just a hunch. The reason has to do with what I'd expect the sequence of events, underneath the skin of Hopscotch, to be, in order to trigger a "pressed" or "tapped" response in any object. I have not yet played with it, but I will be pleasantly surprised to learn that a moving object that happens to find the location of the user's finger-press will have its "tapped" or "pressed" status set. Maybe, but I kinda doubt it. I don't know if I'm sufficiently motivated (I've got a LOT going on) to explore it, but, short of trying it out, I'd put my chips on Hopscotch only recognizing a "pressed" or "tapped" event, if the finger-press happens at the location where an object resides at the moment of touch. Meh. I guess that's easy enough to go check...

If my pessimism turns out to be unwarranted, then, sure, we might be able to raster the "sensor" or "sensors", as I said in a previous post, and have a rotating or translating or otherwise dynamic object bump-into the spot we're touching. And, from that, voila! We can recover the X and Y coordinates of the touch.

So, I'd probably go with an array. And who knows? Maybe a dynamic array in which elements move or change transparency (so as to become touchable) or size (so as to move their boundaries) within the array would get us the precision we'd want.
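One way that dynamic-array idea could play out is a coarse-to-fine search: a small grid of sensors finds which cell was pressed, then the same grid shrinks into that cell and repeats, trading a few extra passes for much better precision than one pass with the same object count. A speculative Python sketch (pass counts and sizes are arbitrary, not anything from Hopscotch):

```python
def refine(touch_x, touch_y, width=1024, height=768, passes=5, cells=4):
    """Each pass shrinks the search window to the sub-cell that was 'pressed',
    so precision improves by a factor of `cells` per pass."""
    x0, y0, w, h = 0.0, 0.0, float(width), float(height)
    for _ in range(passes):
        cw, ch = w / cells, h / cells
        col = min(int((touch_x - x0) // cw), cells - 1)
        row = min(int((touch_y - y0) // ch), cells - 1)
        x0, y0 = x0 + col * cw, y0 + row * ch
        w, h = cw, ch
    # Report the centre of the final (tiny) window.
    return (x0 + w / 2, y0 + h / 2)
```

With 16 objects and 5 passes this narrows 1024 px down to 1024/4^5 = 1 px, versus the roughly 64 px cells a single-pass 16-object grid would give.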

Thankfully, we actually have "last-touch_x" and "last-touch_y" for when we need it, but we can experiment with ideas like this, for when we've got an appetite for a challenge.


I haven't tried yet, but I was thinking of having the text lines quickly alternate between visible and invisible, to see if a text that becomes visible under where the screen's pressed would be classed as being pressed. I still would like to see if anyone made a solution before the last touch code was added.
Someone must have at least done something.

@oio You are right.
Unless the screen's pressed on the sprite's location, the press won't register, even if the sprite appears beneath the pressed location.


I'm now inspired to make a primitive DrawPad. It will use the y position of a sprite that's pressed on the right side of the screen, and the x position of a sprite pressed along the bottom of the screen, to triangulate an x-y position for an emoji that will move around the screen based on those coordinates and leave a trail. I'm starting to wonder a lot more now why no one coded a project, or put effort into solving problems like this in Hopscotch, back when the problem existed.
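The edge-strip DrawPad boils down to two independent one-dimensional sliders driving one pen. A hypothetical sketch (Python; class and method names are invented, not Hopscotch blocks):

```python
class EdgeStripPad:
    """The bottom strip reports an x, the right-side strip reports a y,
    and the pen sprite jumps to the combined coordinate, leaving a trail."""

    def __init__(self):
        self.pen_x = 0.0
        self.pen_y = 0.0
        self.trail = []  # points the emoji has visited

    def press_bottom_strip(self, x):
        """Finger pressed the horizontal strip along the bottom edge."""
        self.pen_x = x
        self._move_pen()

    def press_right_strip(self, y):
        """Finger pressed the vertical strip on the right edge."""
        self.pen_y = y
        self._move_pen()

    def _move_pen(self):
        self.trail.append((self.pen_x, self.pen_y))
```

The obvious limitation this exposes is that one finger can only update one axis at a time, so diagonal strokes come out as staircase steps.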

For example, there was no accurate collision detection code made like the one I made.
And when there was no last touch x-y, why didn't someone code a way to move a sprite anywhere quickly in realtime, and not just with pre-coded move blocks, like pressing a left arrow to move left, etc.?


This post was flagged by the community and is temporarily hidden.


I know a few ways. If the Hopscotch Team knew a way back when there was no last touch block, they should have shared this info with the users so they could use it. The fact they didn't do this makes me think they never found a way to do this.


The hopscotch team lol
Not the hopscotch tram


This post was flagged by the community and is temporarily hidden.


Whoa. Thanks for checking that out. It was just a guess.