TW201105467A - Robotic fashion model - Google Patents

Robotic fashion model

Info

Publication number
TW201105467A
Authority
TW
Taiwan
Prior art keywords
model
fashion
movements
joints
controller
Prior art date
Application number
TW98126134A
Other languages
Chinese (zh)
Inventor
Houng Sun
Yi-Hao Chiu
Original Assignee
Mirle Automation Corp
Priority date
Filing date
Publication date
Application filed by Mirle Automation Corp filed Critical Mirle Automation Corp
Priority to TW98126134A priority Critical patent/TW201105467A/en
Publication of TW201105467A publication Critical patent/TW201105467A/en

Landscapes

  • Toys (AREA)

Abstract

Currently, most fashion garments are displayed statically on a mannequin in a shop window. Consumers cannot see the complete features of the dress, and many business opportunities may be lost because of this ineffective form of demonstration. A robotic fashion model that exhibits the garments and imitates some of a professional fashion model's poses and movements can attract people's attention and leave more room for the imagination. All joints of the robotic fashion model are driven by servo or stepper motors. The motors are controlled by a dedicated servo or stepper controller, a PLC, or a PC so that the movements of all joints appear coordinated. If the model is customized for a special purpose, the controller can be simplified to a single chip, reducing its size and cost. The motion of the joints is designed according to the movements the model needs to perform, and the controller coordinates all the motors, so the model can imitate the graceful movements of a fashion model. In addition, the model can have eye and mouth expressions and a voice that are in harmony with its movements. An infrared sensor installed in the model triggers the demonstration movements as viewers approach.
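The patent describes the control architecture only at a block level and contains no source code. Purely as an illustration of the coordination idea in the abstract, the following Python sketch steps several joint axes toward each target pose in a common control loop so that the joints arrive together; the JointMotor class, run_pose_sequence function, and the example pose values are hypothetical stand-ins, not part of the patent.

```python
# Illustrative sketch only: a simplified coordinator that advances several
# joint motors toward target angles in one shared control loop.
import time

class JointMotor:
    """Stand-in for one servo/stepper axis; a real driver would send step pulses."""
    def __init__(self, name, angle=0.0, max_step=2.0):
        self.name = name
        self.angle = angle          # current angle in degrees
        self.max_step = max_step    # max degrees moved per control tick

    def step_toward(self, target):
        delta = max(-self.max_step, min(self.max_step, target - self.angle))
        self.angle += delta
        return abs(target - self.angle) < 1e-6   # True once this axis has arrived

def run_pose_sequence(motors, poses, tick=0.02):
    """Step all axes toward each pose in a common loop so the joints arrive together."""
    for pose in poses:
        done = False
        while not done:
            # Evaluate every axis each tick (no short-circuiting) so they move in unison.
            arrived = [m.step_toward(pose.get(m.name, m.angle)) for m in motors]
            done = all(arrived)
            time.sleep(tick)

if __name__ == "__main__":
    motors = [JointMotor("shoulder"), JointMotor("elbow"), JointMotor("waist")]
    wave = [{"shoulder": 80.0, "elbow": 30.0}, {"elbow": 60.0}, {"elbow": 30.0}]
    run_pose_sequence(motors, wave)
```

In the actual robot, each tick would command the servo or stepper drivers, and the pacing of the pose list would be tuned to line up with the recorded voice introduction.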

Description

VI. Description of the Invention

[Technical Field of the Invention]

[0001] The present invention is a new concept in which a robot wears the fashion garment and, using the controller's ability to coordinate motions, performs various demonstration movements in time with a recorded voice introduction and the motion of each joint. This lets visitors understand the key points of the fashion design and raises their willingness to buy. If this work had to be done by a real model, one person could hardly keep performing the same demonstration movements for a long time.

[Prior Art]

[0002] In the past, fashion displays in shop windows were static displays on dressed mannequins. A static display cannot present all the advantages of the garment to the consumer.

[Summary of the Invention]

[0003] In the fashion-display robot, each joint is driven by a servo or stepper motor, and a controller makes the joints move together to perform various demonstration actions. A standardized fashion-display robot can even be controlled by a single-chip controller, reducing the controller's size and cost. The movable joints are laid out according to the demonstration movements required. The controller not only sets the direction and speed of each individual motor, but can also drive the motors of all joints at the same time, in step with the voice introduction, to produce graceful human-like movements that show off the advantages being described. In addition, the robot can be given eye and mouth movements that match the body motion or the lip shapes of the speech, and sensors such as an infrared sensor can be added so that the robot begins its demonstration only when it senses a person approaching.

The following detailed description of specific embodiments, taken together with the accompanying drawings, will make the object, technical content, features, and effects of the present invention easier to understand.

[Embodiments]

[0004] Please refer to Fig. 1. Parts 1 to 30 are the movable parts. The eye movement of part 1 and the mouth movement of part 2 are produced by a solenoid valve or any actuator capable of extending and retracting; under PLC control they open and close in time with the voice. Part 6 (the elbow) and part 9 (the knee) bend and extend, while the other parts rotate.

The drive motor at part 20 turns the rotation shaft at part 30 forward and backward, and the forward and reverse rotation of the screw at part 50 drives the gear at part 60 to bend and extend. All of these drive motors are servo or stepper motors, which allow the rotation angle to be controlled precisely, and a plurality of servo or stepper motor controllers can control multiple axes simultaneously to produce coordinated motion in many postures. The fashion-display robot can start its demonstration via a sensor; the whole demonstration sequence is accompanied by a voice introduction, and the movements must match the content being introduced.

The user does not need to know the PLC programming language. To operate the robot, the user only needs the controller's joystick to bring each joint to the desired angle and store it; this is a so-called "teach point." Various postures are stored as teach points, and when the stored movements are played back in sequence the robot produces movements that imitate a real person.

For mass production the controller can even be simplified to a single chip preset with several fixed demonstration routines from which the customer may freely choose, and the controller can start the demonstration via the infrared sensor at part 12.

Therefore, any dressed display robot made under all or part of the above combination of concepts falls within this invention patent.
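As an illustration only of the "teach point" workflow described above (jog each joint to a pose with the joystick, store the angles, then replay the stored poses in order), here is a minimal Python sketch. The controller object and its read_joint_angles() and move_to() methods are hypothetical placeholders for whatever interface the real motion controller exposes.

```python
# Minimal sketch of teach-point recording and playback; all controller-facing
# calls (read_joint_angles, move_to) are hypothetical placeholders.
import json
import time

class TeachPendant:
    def __init__(self, controller):
        self.controller = controller   # exposes read_joint_angles() and move_to()
        self.points = []               # ordered list of taught poses

    def teach_current_pose(self):
        """Store the pose the operator has jogged the robot into."""
        self.points.append(dict(self.controller.read_joint_angles()))

    def save(self, path):
        with open(path, "w") as f:
            json.dump(self.points, f)

    def load(self, path):
        with open(path) as f:
            self.points = json.load(f)

    def play_back(self, dwell=0.5):
        """Replay the taught poses in order to reproduce the demonstration."""
        for pose in self.points:
            self.controller.move_to(pose)   # blocking move to the stored angles
            time.sleep(dwell)               # brief hold at each pose
```

In practice the dwell time and per-pose speed would be tuned so that the replayed movements line up with the recorded voice introduction.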

[Brief Description of the Drawings]

[0005] Fig. 1 shows the model figure; movable joints can be provided at the numbered parts as required. The infrared sensor 12 is placed on the model's abdomen. With the sensor in this position, the dressed model can stand on a display platform at the same height as the visitors, or on a higher platform, and still sense the visitors' approach. The eye (1) and mouth (2) movements can be produced by a solenoid-type actuator or a similar drive device placed inside the head shell together with a speaker.

Fig. 2 shows the neck drive motor; rotation of the motor turns the head left and right.

Fig. 3 is a photograph of the three joints of the whole arm.

Fig. 4 is a photograph of the shoulder drive motor at part 4; forward and reverse rotation of the motor raises or lowers the whole arm.

Fig. 5 is a photograph of the outside of the wrist at part 5.

Fig. 5-1 is a photograph of the rotation shaft of the wrist at part 5.

Fig. 6 is a photograph of the elbow at part 6; the knee at part 9 has the same construction. Forward and reverse rotation of the screw 50 drives the gear 60 back and forth, producing the waving or leg-bending motion.

Fig. 7 is the upper half of section A-A at the waist, part 7; 30 is the rotation shaft, which turns the whole upper body horizontally left and right.

Fig. 8 is the lower half of section A-A at the waist, part 7; the drive motor 20 turns the rotation shaft 30 of Fig. 7 horizontally left and right.

Fig. 9 shows the infrared sensor at part 12, which senses an approaching visitor and starts the robot's demonstration routine.

The remaining parts not described above have the construction and mode of operation shown in Figs. 8 and 9. The motion of every movable joint obeys the controller; by changing the joints through different angles at the same time, in step with the recorded introduction, the robot can present a variety of demonstration movements.

These movements do not have to be programmed. The joints are simply brought to the desired positions and stored by teaching with the controller, and playing back the series of stored movements completes the demonstration, which makes the robot very easy to operate. Different series of movements can be stored, so choosing movements later is also very convenient.

For a low-cost display robot, a single-chip controller can be loaded with a few fixed routines, and the user can simply select the desired demonstration by number; even the teaching step is then unnecessary.

[Description of Reference Numerals]

[0006]
1–30  Parts of the model where movable joints can be provided as required
12    Infrared sensor
20    Drive motor
30    Rotation shaft
50    Screw
60    Gear
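As an illustration of the low-cost, single-chip variant described above (a few fixed routines selected by number and triggered by the infrared sensor), the following is a hypothetical Python sketch; the person_detected and run_step callables, and the routine names, stand in for the sensor input and motion output of the real controller and are not taken from the patent.

```python
# Hypothetical sketch of a preset-routine controller: the user picks a routine
# by number, and the infrared sensor starts the demonstration.
import time

ROUTINES = {
    1: ["turn_waist_left", "raise_arm", "wave", "lower_arm", "turn_waist_right"],
    2: ["nod", "open_mouth", "close_mouth", "bow"],
}

def demo_loop(selected, person_detected, run_step, idle_poll=0.2):
    """Wait for a visitor, then run the routine chosen by number."""
    steps = ROUTINES[selected]
    while True:
        if person_detected():          # e.g. the infrared sensor on the abdomen
            for step in steps:
                run_step(step)         # each step is one stored pose or motion
        time.sleep(idle_poll)          # poll the sensor while idle
```

Because the routines are fixed, no teaching or programming is needed on site: the sensor callback and the per-step motion commands would simply be wired to the single-chip controller's I/O.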

Claims (1)

VII. Claims:

1. A robot that can wear the fashion garment on display and, following a program, show visitors the key points of the garment's design in various pre-designed postures.
2. The eyes and mouth can move in time with a pre-recorded spoken introduction.
3. Each movable joint can perform the designed demonstration movements in time with the introduction.
TW98126134A 2009-08-04 2009-08-04 Robotic fashion model TW201105467A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW98126134A TW201105467A (en) 2009-08-04 2009-08-04 Robotic fashion model

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW98126134A TW201105467A (en) 2009-08-04 2009-08-04 Robotic fashion model

Publications (1)

Publication Number Publication Date
TW201105467A (en) 2011-02-16

Family

ID=44814019

Family Applications (1)

Application Number Title Priority Date Filing Date
TW98126134A TW201105467A (en) 2009-08-04 2009-08-04 Robotic fashion model

Country Status (1)

Country Link
TW (1) TW201105467A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10367394B2 (en) 2016-07-12 2019-07-30 Industrial Technology Research Institute Electromagnetic spring and elastic actuator having the same
CN112578716A (en) * 2020-12-23 2021-03-30 福建(泉州)哈工大工程技术研究院 Humanoid robot control system for shoe and clothes display


Similar Documents

Publication Publication Date Title
Salichs et al. Mini: a new social robot for the elderly
Ishiguro Android science: conscious and subconscious recognition
Suguitan et al. Blossom: A handcrafted open-source robot
Yohanan et al. The role of affective touch in human-robot interaction: Human intent and expectations in touching the haptic creature
Genç et al. Exploring computational materials as fashion materials: Recommendations for designing fashionable wearables
TWI421120B (en) Facial expression changeable robot head and method of manufacturing virtual face skin thereof
Takacs Special education and rehabilitation: teaching and healing with interactive graphics
Ladenheim et al. Live dance performance investigating the feminine cyborg metaphor with a motion-activated wearable robot
WO2020190362A2 (en) A social interaction robot
Lee et al. The design of a semi-autonomous robot avatar for family communication and education
Shidujaman et al. “roboquin”: A mannequin robot with natural humanoid movements
Short et al. Sprite: Stewart platform robot for interactive tabletop engagement
TW201105467A (en) Robotic fashion model
Dhanasree et al. Hospital emergency room training using virtual reality and leap motion sensor
Vandevelde et al. An open platform for the design of social robot embodiments for face-to-face communication
Maneewarn Survey of social robots in Thailand
Lee et al. Semi-autonomous robot avatar as a medium for family communication and education
Kim et al. Biologically inspired models and hardware for emotive facial expressions
KR20080078324A (en) Self-regulating feeling representation system of robot
Hackel et al. Designing a sociable humanoid robot for interdisciplinary research
CN210606303U (en) Robot display system
Basori et al. Emotional facial expression based on action units and facial muscle
Zheng et al. Interpretation of human and robot emblematic gestures: How do they differ
Oh et al. Automatic emotional expression of a face robot by using a reactive behavior decision model
Dai et al. A virtual companion empty-nest elderly dining system based on virtual avatars