Confessions of a researchaholic

August 30, 2019

Train sketcher

Filed under: Real — liyiwei @ 7:40 pm
Tags:

After settling down across the aisle, this passenger asked me if the train was heading to San Francisco. Soon afterwards he took out several drawing pencils, all worn short from use, and started finishing some drawings in a sketchbook. With intense focus he crouched over his work, never noticing me sketching him.

Quick sketch of a train sketcher

A post shared by Kublai (@kub1ai) on Instagram

August 24, 2019

Illustrator leaves tile

Filed under: Real — liyiwei @ 3:13 pm
Tags:

This is an exercise to help me experience the process of creating repetitive element patterns and tiles.
Basically, I can draw one element and copy/paste it with different positions/orientations to form an initial pattern.
I can then create a grid pattern from the entire selection, fine-tune the elements with the handy interactive preview, add the pattern to the swatches, and use the pattern to fill a new rectangle (with the same size as the original pattern).
I can then test it via various tiling tools, such as html or this pattern checker.
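The html tiling test can be as simple as a page that repeats the exported tile as a CSS background; seams or misaligned elements show up immediately at the tile borders. A minimal Python sketch (the file name `pattern.png` and the tile size are assumptions, not the actual exercise files):

```python
# Generate a throwaway HTML page that tiles an exported pattern image.
# "pattern.png" and the 128px tile size are placeholders for your own export.

def tiling_test_page(image_path="pattern.png", tile_px=128):
    """Return an HTML page that repeats image_path edge to edge."""
    return f"""<!DOCTYPE html>
<html><head><style>
  body {{
    margin: 0;
    background-image: url("{image_path}");
    background-size: {tile_px}px {tile_px}px;
    background-repeat: repeat;  /* tile in both directions */
  }}
</style></head><body></body></html>"""

# Write the page next to the exported tile and open it in a browser.
with open("tile_test.html", "w") as f:
    f.write(tiling_test_page())
```

Any visible seam at the tile borders means the element placement or the rectangle size drifted from the swatch bounds.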

During the pattern creation stage I somehow altered the aspect ratio away from the initial square, so the result is a rectangle. I plan to fix this in my next exercise.

August 23, 2019

SIGGRAPHickness

Filed under: Real — liyiwei @ 4:56 pm
Tags:

A co-author added a typo “SIGGRAPHickness” into a bib entry today; I guess it means sickness induced by working too hard for SIGGRAPH.
🙂

August 19, 2019

What I told an intern today based on recent experience

Filed under: Real — liyiwei @ 4:50 pm
Tags:

If we do not figure out what we want to do, someone else will, and they might not have our best interest in mind.

August 18, 2019

Quick sketch of a foot

Filed under: Real — liyiwei @ 9:11 pm
Tags:

Quick sketch of a foot

A post shared by Kublai (@kub1ai) on Instagram

My idea of a mixed reality device

Filed under: Real — liyiwei @ 5:03 pm
Tags:

A single device that bridges VR and AR: it displays synthetic graphics seamlessly with or without the real environment (captured by a sensor/camera rather than requiring optical see-through), consumes low power, and tracks eye gaze (e.g., for foveation) and location (e.g., for overlaying geo information).

Much of this can already be achieved programmatically on some smartphones; glasses can reach a subset of users (e.g., those already wearing prescription glasses or requiring hands-free working conditions).

Disclaimer: this is my personal opinion, conceived and written on my personal computer outside Adobe working hours.

August 4, 2019

SIGGRAPH 2019

Filed under: Real — liyiwei @ 4:20 pm
Tags:

I spent most of my time in the experience hall with the workshops and demos.
I did not attend a single paper session.
I could have socialized with the paper authors a bit more, but I did bump into some of them at the parties.
The more conferences I attend, the more I believe in the value of hands-on experience, trying out new things and interacting with different people, over just passively listening to talks (most of which are recorded for later viewing, except for the movie production sessions).

For the entire week (from July 25 to 31) I had dinners at various parties (the SIGGRAPH Asia technical papers committee meeting and the SIGGRAPH conference).

Sunday

The cybersickness workshop sounded relevant to my research. But when I went there, I saw a presenter talking over slides full of dense text and already felt sick, without any HMD. So I left.

I knew about turtle visual programming, but this was my first hands-on try with the TurtleStitch embroidery machine.
Basically, you control the turtle (pointer) movement via blocks and write a program by chaining the actions together. Warnings show up if any step may cause manufacturing issues, such as large strides. The program can then be sent to an embroidery machine to stitch the design.
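The block-chaining model can be mimicked in a few lines of Python. This is a hypothetical sketch, not the actual TurtleStitch (Snap!) blocks, and the 12 mm stride threshold is an assumption:

```python
import math

# Hypothetical model of TurtleStitch-style programming: each block moves
# the turtle (pointer) and records a stitch point; strides longer than a
# threshold are flagged as potential manufacturing issues.
MAX_STRIDE_MM = 12.0  # assumed threshold; real machines vary

class StitchTurtle:
    def __init__(self):
        self.x, self.y = 0.0, 0.0
        self.heading = 0.0  # degrees, 0 = +x axis
        self.stitches = [(0.0, 0.0)]  # recorded needle positions
        self.warnings = []

    def forward(self, dist):
        """Move the turtle, recording a stitch at the new position."""
        if dist > MAX_STRIDE_MM:
            self.warnings.append(f"stride {dist}mm exceeds {MAX_STRIDE_MM}mm")
        rad = math.radians(self.heading)
        self.x += dist * math.cos(rad)
        self.y += dist * math.sin(rad)
        self.stitches.append((round(self.x, 6), round(self.y, 6)))

    def right(self, deg):
        """Turn clockwise; chaining turns and moves forms the 'program'."""
        self.heading -= deg

# Chain blocks to stitch a 10mm square.
t = StitchTurtle()
for _ in range(4):
    t.forward(10)
    t.right(90)
print(len(t.stitches), t.warnings)  # → 5 []
```

Five stitch points close the square with no stride warnings; a single `forward(20)` would be flagged.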

TurtleStitch embroidery – wow smiley face

A post shared by Kublai (@kub1ai) on Instagram

The VR/AR magic session was much more interesting, with several talks from industry.
For stationary players, locomotion in games can be achieved via hand controllers for speed and turn.
Synthetic blinks in the rendering can be added for teleportation and turn without eye tracking.
Magic Leap presented Mica, which looked very realistic during the presentation; it made me wonder whether the company is considering pivoting toward content creation if the hardware does not pan out.
I made a mental note to try the demo to see how opaque the graphics really are.
Mica can be animated with the geometry model plus helper joints.
Particular emphasis is on gaze and attention simulation, focusing on the social triangle of the face, and on saccade simulation.
The game porting talk was boring to me, so I did not pay much attention.
The sonic immersion talk was very good; the two presenters skillfully used non-VR demos to demonstrate the importance of sound effects for immersive ambisonics.
More information can be found under ICTUS audio.

I went to lunch at Cow Cafe (a vegetarian place near the LA convention center south hall). The egg scrambles were good.

I spent the entire afternoon in the immersive pavilion.
Heterotopias performs cinematic cuts during blinks; the idea sounds very interesting, but the demo did not really work for me, as my glasses might have prevented accurate eye tracking.

I forgot that I actually had no paper in SIGGRAPH 2019 and thus would have to enter the papers fast forward with everyone else; I ended up sitting in the first row.
I had some quasi-dinner at the Beijing academy and Taipei parties afterwards.
I bumped into Kurt and recommended that he look at the Shard/Glasswing demo; he said that by nature he would ask a lot of pointed technical questions, and I told him that Gavin would be there to answer them.

Monday

I went to register for the Mica demo, and bumped into Zhenyi in the queue. She mentioned goo.gl/x5RQHN as a related work.

I could not get into the NASA VR session, so I went to the experience art talk (AI and embodied experience) instead.
Lavin trains a junky neural network to output a limited vocabulary for a person image, and renders the corresponding 3D shapes in VR.
The hugging sculpture photogrammetry is also interesting.

Cow Cafe ran out of ingredients for scrambled eggs, so I ended up with a less tasty avocado toast. I guess they also ran out of arugula and gave me romaine lettuce instead.

The Mica demo has interesting interactivity; the graphics are not as opaque as they should be (as I expected), but it is OK. I did not try to go off script, as the aim is visual rather than interaction, but I still managed to break the demo in 10 seconds.

On my way to the keynote, Johannes told me it was really bad and unprepared. It was unscripted, but not as bad as he said. Sometimes we can learn more about the speaker from a spontaneous talk.

The Spheres VR movie is very good; Darren Aronofsky is listed as a contributor.

The de-noising session looked interesting, so I went there.
Color space sculpting seems like a simple and effective idea for color editing.
I doubt how general the neural-network de-speckle approach can be.

When I arrived at the NVIDIA party after the only electronic theater show, the hotel staff was already starting to take away the food.

Tuesday

I went to the After Effects creative medium workshop. I could not really follow due to the small display text, but managed to experiment with the tutorial file a bit to get the gist (the usual scripting and compositing stuff over the video timeline).
Basically, there are people using Ae purely for synthetic effects instead of video post-processing.

I then went to the hardwearable session.
I missed most of the NVIDIA AR eyeglass talk.
The electrically controlled plastic haptic device looked interesting, and I made a mental note to try it in etech.
The artificial tail and the temperature-control auxetic were also interesting.

The interactive street art AR workshop was fun and reminded me of Aero authoring.

Real-time live was fun as usual. Hair VR did not win best of show, but Liwen and Hao did a great demo (with Jun’s head).

I skipped the Stanford reunion and went straight to the Facebook party; the venue was great!
After I gave my drink tickets to Kari, he told me that the Stanford reunion had great drinks.


I did not manage to attend the Adobe party before closing.

Wednesday

I spent most of the morning in the VR demos.
The guy at the autism AR app told me how PNG used to support vector graphics.
The OVS + tumor VR app looked and interacted very well.
The VR redirected-walking space station demo (Frank Steinicke) worked very well by simply using a highly occluded environment with task distraction, without even leveraging blinks or saccades.
The Unreal workshop was fun; one can place scene geometry structures for a character to move around in a simple game level.
The Being Henry VR movie, with gaze and blink control, works very well and realistically simulates the situation of someone who cannot move anything except their eyes and one hand.

After accidentally attending the women’s lunch at UIST 2016, I bravely attended the Berthouzoz women’s lunch at SIGGRAPH, and even more bravely volunteered as a discussion leader.
The speakers were inspirational, but I had to cut the lunch short to attend a meeting with Illustrator guys coming all the way from India.
The meeting was very productive, and I managed to come up with four potential projects.

The DreamWorks party had good food; it took place inside the convention center, so it was very easy to attend.

The Game of Thrones production session was fun even though I never watched the show.
It was too late to attend the Snap party afterwards, so I went to the Apple party. The Canadian party was right around my hotel, but I was too sleepy to go.

Thursday

I got up early for the Glasswing talk session at 9 am, and tried some VR stuff afterwards, such as T-Rex assembly, beach muscle building, etc.

I did not realize the physical pad (Bamboo) actually links with Wacom Inkspace, and had a great time experimenting with the hardware.

Autodesk Fusion 360 can create CAD shapes to guide drawing in SketchBook; I got the idea even though I could not create complex shapes.

I bumped into Jonash and lunched with him at Cow Cafe (which kindly gave me a discount code).
I had to miss the Alita production session for my flight, which was delayed for 2 hours. I plan not to cut the conference short in the future unless I really must fit a flight schedule (not the case between LAX and SFO).

TODO: check out the black hole talk, which people told me was great and should have been the keynote. Why do they put this really interesting, forward-looking stuff in the early mornings?

August 1, 2019

Quick still life experiment with Wacom Inkspace

Filed under: Real — liyiwei @ 11:36 am
Tags:

This is a combination of physical and digital drawings.
Install the Inkspace app on a mobile device (I used my Moto X4 phone), and connect it (via Bluetooth) with a Bamboo smartpad holding physical paper. All my drawings on the smartpad were synchronized in real time with the Inkspace app on the phone.
There is only a pen, no eraser, on the physical pad.

I like the tactile feedback of the physical pen and paper, but would probably prefer drawing directly on a tablet for more editing options (layers, colors, undo/redo, strokes, etc.) without having to switch between the physical pad and the virtual app.
