
‘Subjectivity’ and the Designer’s Intuition

The emotional or ‘affective’ qualities of space have long been referred to as ‘intangible’ and ‘subjective’ aspects that cannot easily be empirically analyzed. Designers thus rely heavily on their own intuitive processes when dealing with the realm of spatial affect. Can the positionality and intuitions of a designer be supplemented with machine learning models trained on user-specific datasets, thus responding to the ‘subjective’ affective nuances of the target occupant(s)?

This project outlines a preliminary design-assistance workflow for rapid user-specific data collection and predictive affect analysis of enclosures in early-stage design.

Find the publication here.  The complete volume can be downloaded here. 

This framework for ‘customized emotional design’ comprises two main components – the Data Collection Workflow and the Affect Analysis Workflow. The former integrates the collection of rapid user-specific affective datasets into the early-stage design process, while the latter draws upon these datasets for real-time, user-specific affective evaluation of a design.

The Data Collection Workflow

For demonstration, the Data Collection Workflow was adapted for the Rhino3d + Grasshopper platform. A custom Python script within Rhino generated 100 spatial enclosures (all bedrooms) randomized along 9 spatial parameters, namely length, width, ceiling height, sill height, lintel height, number of windows, window position, total window width and wall hue. A bed, wardrobe, ceiling fan and tube-light were the only furniture/fixture items placed within each of the spaces, in fixed relative locations. These objects were placed primarily to anchor the functional specificity of the enclosure as a bedroom. In addition, a dummy human figure was placed near the opposite wall to anchor scale. All such items were rendered in white, in order to retain primary focus on the spatial parameters of the enclosure. Eye height was fixed at 1500 mm and the camera position was fixed near one corner of the room, to give the subject a complete view of the room from that vantage point.
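The randomization step can be sketched as follows. This is a minimal stand-in for the Rhino script, not the project's actual code: the parameter names mirror the 9 parameters listed above, but the numeric ranges are illustrative assumptions (the ranges used in the project are not specified here).

```python
import random

# Illustrative ranges (mm for dimensions, degrees for hue) -- assumed,
# not taken from the project.
PARAM_RANGES = {
    "length": (2400, 6000),
    "width": (2400, 6000),
    "ceiling_height": (2400, 4200),
    "sill_height": (300, 1200),
    "lintel_height": (2000, 2400),
    "num_windows": (1, 3),
    "window_position": (0.0, 1.0),   # normalized position along the wall
    "total_window_width": (600, 3000),
    "wall_hue": (0, 360),
}

def random_enclosure(rng=random):
    """Sample one randomized bedroom enclosure across the 9 spatial parameters."""
    params = {}
    for name, (lo, hi) in PARAM_RANGES.items():
        if isinstance(lo, int) and isinstance(hi, int):
            params[name] = rng.randint(lo, hi)   # integer parameters
        else:
            params[name] = rng.uniform(lo, hi)   # continuous parameters
    return params

# 100 randomized enclosures, as in the project
enclosures = [random_enclosure() for _ in range(100)]
```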

scenes

Randomised spatial enclosures.

A custom render pipeline was scripted in Python to automatically generate 360° spherical renders for each of the randomized enclosures using VRay for Rhino. A dataset comprising the feature values for each enclosure was also generated as a .csv file.
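Exporting the feature dataset can be sketched with the standard library's `csv` module. This is a hypothetical helper, assuming each enclosure is a dict of parameter values as in the sketch above; the project's actual column names and file layout are not specified.

```python
import csv

def write_feature_dataset(enclosures, path):
    """Write one row of feature values per enclosure to a .csv file.

    `enclosures` is a list of dicts mapping parameter name -> value;
    a header row is written from the (sorted) parameter names.
    """
    fieldnames = sorted(enclosures[0].keys())
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        for enc in enclosures:
            writer.writerow(enc)
```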

render_gif_short

Automated enclosure generation and 360° spherical render pipeline

360° spherical renders for immersive VR viewing

The renders were then adapted for immersive stereoscopic VR viewing using the Fulldrive engine and a Head Mounted Display (HMD) unit (Isuru Play VR). The environment was also live-streamed to a laptop for real-time monitoring. Five subjects were exposed to each scene for approximately 20-30 seconds and asked to rate the enclosures.

vr_setup

The Head Mounted Display setup (left) and the immersive VR scene for a sample enclosure (right)

The ratings were in the form of affective appraisals on an EmojiGrid (Toet and van Erp, 2019) that appeared within the environment, near the observer’s feet. The subjects indicated a point on the EmojiGrid, and the response was converted to Valence and Arousal values on a 9-point Likert scale between -4 and +4.
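The conversion from a grid point to Likert ratings can be sketched as below. This assumes the indicated point is expressed in normalized coordinates (0 to 1), with x running along the valence axis and y along the arousal axis; the project's actual coordinate convention is not stated.

```python
def grid_to_likert(x, y):
    """Map a normalized EmojiGrid point (x, y in [0, 1]) to 9-point
    Valence and Arousal ratings between -4 and +4."""
    def to_scale(t):
        # Stretch [0, 1] to [-4, +4], snap to the nearest integer step,
        # and clamp to the scale bounds.
        return max(-4, min(4, round(t * 8 - 4)))
    return to_scale(x), to_scale(y)
```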

emojigrid_annot

The EmojiGrid (Toet and van Erp, 2019)

The aim of the Data Collection Workflow is thus to integrate the rapid collection of user-specific affective data into early-stage design, so that machine learning models can learn the affective preferences of the user. This serves as the foundation for real-time design assistance.

final_frames_small

360° spherical VR scenes and corresponding responses on the EmojiGrid (inset).

params

Key dataset parameters (Subject 3).

Affect Analysis Workflow

The Affect Analysis Workflow loads custom user-specific datasets for predictive, real-time affective analysis of designed enclosures. For demonstration, the analysis framework was scripted as a custom Python component within Grasshopper3d using the GHCPython component (AbdelRahman, 2017). The key components of the test enclosure (such as walls, windows, floors and ceilings) first had to be defined in the Grasshopper script. The script would then compute the values of the 9 key spatial parameters. These served as test data for the machine learning models encoded within the custom Python component to make real-time predictions.
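Deriving the spatial parameters from the model geometry might look something like the following. This is a heavily simplified, hypothetical stand-in for the Grasshopper component: it assumes a rectangular floor outline given as corner points and windows given as (sill, lintel, width) triples, and it omits window position, whose computation would depend on modeling conventions not described here.

```python
def spatial_parameters(floor_corners, windows, ceiling_z, wall_hue):
    """Derive key spatial parameters from simplified enclosure geometry.

    floor_corners: [(x, y), ...] for a rectangular floor outline (mm).
    windows: [(sill_z, lintel_z, width), ...] per window opening (mm).
    """
    xs = [p[0] for p in floor_corners]
    ys = [p[1] for p in floor_corners]
    return {
        "length": max(xs) - min(xs),
        "width": max(ys) - min(ys),
        "ceiling_height": ceiling_z,
        "sill_height": min(w[0] for w in windows),
        "lintel_height": max(w[1] for w in windows),
        "num_windows": len(windows),
        "total_window_width": sum(w[2] for w in windows),
        "wall_hue": wall_hue,
    }
```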

Picture1

KNN Regression for Affect Prediction

In the current version of the framework, the Python script applied a K-Nearest Neighbor (KNN) regression algorithm to the training datasets using the scikit-learn library (Buitinck et al., 2013) in order to predict Valence and Arousal ratings. For the 5 demonstrative subjects, the model was able to predict the ratings with average root mean squared errors of 1.37 and 1.35 for Valence and Arousal respectively (on 9-point scales between -4 and +4). The table (left) summarizes the key error metrics for the framework. On the whole, methodology 2 allowed for greater consistency of responses within the collected dataset, and thus greater prediction accuracy.
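The project used scikit-learn's `KNeighborsRegressor`; the core idea can be shown in a minimal standard-library sketch: predict a (Valence, Arousal) pair for a new enclosure as the mean of the ratings of its k nearest neighbours in feature space, and score with RMSE. The data in the usage below is made up for illustration.

```python
import math

def knn_predict(train_X, train_y, query, k=3):
    """Predict a (valence, arousal) pair for `query` as the mean rating of
    the k nearest training samples under Euclidean distance -- the same
    idea as scikit-learn's KNeighborsRegressor."""
    ranked = sorted((math.dist(x, query), y) for x, y in zip(train_X, train_y))
    nearest = [y for _, y in ranked[:k]]
    n_targets = len(nearest[0])
    return tuple(sum(y[i] for y in nearest) / k for i in range(n_targets))

def rmse(predicted, actual):
    """Root mean squared error over paired rating lists."""
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(predicted, actual))
                     / len(predicted))
```

Usage on toy data: with training features `[[0], [1], [2], [10]]` and ratings `[(0, 0), (2, 2), (4, 4), (-4, -4)]`, querying at `[1]` with `k=3` averages the three nearest ratings.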


plot_singe

Predicted vs Rated values for Valence and Arousal. (Subject 2, Test size = 20)

Design Assistance

For design assistance, the predicted Valence and Arousal ratings computed by the regression models would be converted into a point on the EmojiGrid and displayed within the Rhino model space. Any change to the formal attributes of the enclosure results in the point being updated on the grid. This allows for real-time predictive analysis of the affective impact of design decisions or alterations. The figure shows the affective analysis framework in action, predicting different user-specific affective states for different spatial configurations of an enclosure in early-stage design.
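The display step is the inverse of the rating conversion: predicted ratings are mapped back to a grid point. A minimal sketch, again assuming normalized grid coordinates with valence on x and arousal on y (the actual drawing of the point in Rhino model space is not shown):

```python
def likert_to_grid(valence, arousal):
    """Map predicted 9-point Valence/Arousal ratings (-4..+4) back to a
    normalized point (x, y in [0, 1]) for display on the EmojiGrid."""
    def to_unit(v):
        # Clamp to the scale, then rescale [-4, +4] to [0, 1].
        return (max(-4, min(4, v)) + 4) / 8
    return to_unit(valence), to_unit(arousal)
```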

shots_final

The Design Assistance Interface with predicted affective states reflected on the EmojiGrid

analysis

Sample demonstration of the affect analysis workflow. Design assistance based on the training data of Subject 3. 

The design assistance framework described in this paper thus attempts to integrate rapid affective data collection and real-time analysis into the early-stage spatial design process. It is intended as a step towards integrating user-specific and ‘subjective’ patterns of affective response into a computational framework, opening up possibilities for the emotional customization of spaces. Such predictive models can plug into existing algorithmic and evolutionary floor plan generation frameworks as a tool for the optimization of affective parameters. There is also scope for such a predictive framework to provide an affective dimension to the future of dynamic or parametrically alterable physical structures, where enclosures may change their form based on a desired ‘mood’ of the occupant. It is hoped that such directions towards engaging with user-specific spatial affect open up new dimensions to the role of the architect in the near future.

References:

AbdelRahman, M. (2017) MahmoudAbdelRahman/GH_CPython: Alpha release of GH_CPython plugin, Zenodo.

Buitinck, L., Louppe, G., Blondel, M., Pedregosa, F., Mueller, A., Grisel, O., Niculae, V., Prettenhofer, P., Gramfort, A., Grobler, J., Layton, R., Vanderplas, J., Joly, A., Holt, B., Varoquaux, G. (2013) API design for machine learning software: experiences from the scikit-learn project, arXiv:1309.0238.

Toet, A., and van Erp, J. (2019) The EmojiGrid as a Tool to Assess Experienced and Perceived Emotions, Psych, 1(1), 469-481.

Russell, J. A. (1980) A circumplex model of affect, Journal of Personality and Social Psychology, 39(6), 1161-1178.

‘Feeling Spaces’: Machine Learning for Predictive Emotion Analysis
