
WO2018043693A1 - Game program, method, and information processing device - Google Patents


Info

Publication number
WO2018043693A1
Authority
WO
WIPO (PCT)
Prior art keywords
action
game
character
game program
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2017/031526
Other languages
French (fr)
Japanese (ja)
Inventor
大輔 村野
誠 川又
裕史 渡邉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Colopl Inc
Original Assignee
Colopl Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2016171246A (JP6190505B1)
Priority claimed from JP2017158755A (JP2019034075A)
Application filed by Colopl Inc
Publication of WO2018043693A1

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 - Input arrangements for video game devices
    • A63F13/21 - Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 - Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 - Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads, the surface being also a display device, e.g. touch screens
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment, by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426 - Processing input control signals of video game devices by mapping the input signals into game commands, involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 - Controlling the output signals based on the game progress
    • A63F13/52 - Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 - Controlling game characters or game objects based on the game progress
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • The present disclosure relates to a game program, a method, and an information processing apparatus.
  • Patent Document 1 describes an interface program for advancing a game through touch input by a user.
  • The purpose of this disclosure is to provide a more engaging action game.
  • According to one aspect, a game program is executed by a computer including a processor, a memory, and a touch screen.
  • A game based on the game program is a game that, when an operation on the touch screen is accepted, causes an operation character operated by the user to execute an action associated with the operation.
  • The game program causes the processor to execute a step of displaying an edit screen that associates each operation included in an operation sequence, which consists of continuous operations on the touch screen, with an action according to that operation's order in the sequence.
  • According to another aspect, a method for executing a game program is provided.
  • The game program is executed by a computer including a processor, a memory, and a touch screen.
  • A game based on the game program is a game that, when an operation on the touch screen is accepted, causes an operation character operated by the user to execute an action associated with the operation.
  • In the method, the processor displays an edit screen that associates each operation included in an operation sequence, which consists of continuous operations on the touch screen, with an action according to that operation's order in the sequence.
  • According to yet another aspect, an information processing apparatus includes a storage unit that stores a game program, a control unit that controls the operation of the information processing apparatus by executing the game program, and a touch screen.
  • A game based on the game program is a game that, when an operation on the touch screen is accepted, causes an operation character operated by the user to execute an action associated with the operation.
  • The control unit displays an edit screen that associates each operation included in an operation sequence, which consists of continuous operations on the touch screen, with an action according to that operation's order in the sequence.
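As an illustration only (the patent specifies no implementation), the edit screen's association between each operation in a sequence and an action can be pictured as an ordered mapping in which the same operation yields different actions depending on its position. All names in the following Python sketch are hypothetical, not from the patent.

```python
# Illustrative sketch only: models an edit screen's operation-sequence
# -> action mapping. All operation and action names are hypothetical.

OPERATIONS = ("tap", "flick", "swipe")          # operations on the touch screen
ACTIONS = ("slash", "kick", "special_combo")    # actions the character can take

class OperationSequenceEditor:
    def __init__(self):
        # Ordered list of (operation, action) pairs; the list index is the
        # operation's order within the continuous operation sequence.
        self.sequence = []

    def assign(self, operation, action):
        """Append an (operation, action) pair, tying the action to the
        operation's order in the sequence, as on the edit screen."""
        if operation not in OPERATIONS or action not in ACTIONS:
            raise ValueError("unknown operation or action")
        self.sequence.append((operation, action))

    def action_for(self, step, operation):
        """Return the action for `operation` at position `step`, or None."""
        if step < len(self.sequence) and self.sequence[step][0] == operation:
            return self.sequence[step][1]
        return None

editor = OperationSequenceEditor()
editor.assign("tap", "slash")
editor.assign("tap", "kick")
editor.assign("flick", "special_combo")
# The same operation maps to different actions depending on its order:
print(editor.action_for(0, "tap"))   # slash
print(editor.action_for(1, "tap"))   # kick
```

This captures the key property described above: the action executed is determined not by the operation alone but by the operation together with its order in the sequence.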
  • (A) and (B) are diagrams schematically illustrating an example of a method for detecting a rotation operation.
  • (A) and (B) are diagrams showing specific examples of the edit screen displayed on the display unit. Also included is a flowchart showing the flow of processing performed on the edit screen based on the game program according to the present embodiment.
  • (A) to (C) are diagrams showing specific examples of the edit screen displayed on the display unit. Also included is a flowchart showing the flow of processing performed in response to continuous operations based on the game program according to the present embodiment.
  • (A) to (C) are diagrams showing specific examples in which recommended operation sequence information is automatically set on the edit screen displayed on the display unit.
  • (A) and (B) are diagrams showing a specific example in which operation sequence information is shared on the edit screen displayed on the display unit. A further diagram shows another specific example of the edit screen displayed on the display unit.
  • Embodiment: A game system according to the present embodiment is a system for providing a game to a plurality of users.
  • The game system will be described below with reference to the drawings. It should be noted that the present invention is not limited to these examples but is defined by the scope of the claims, and is intended to include all modifications within the meaning and scope equivalent to the claims. In the following description, the same reference numerals are given to the same elements, and repeated description is omitted.
  • FIG. 1 is a diagram illustrating a hardware configuration of the game system 1.
  • The game system 1 includes a plurality of user terminals 100 and a server 200, as illustrated. Each user terminal 100 is connected to the server 200 via the network 2.
  • The network 2 includes the Internet and various mobile communication systems constructed with radio base stations (not shown). Examples of the mobile communication systems include so-called 3G and 4G mobile communication systems, LTE (Long Term Evolution), and wireless networks that can be connected to the Internet through a predetermined access point (for example, Wi-Fi (registered trademark)).
  • The server 200 (computer, information processing apparatus) may be a general-purpose computer such as a workstation or a personal computer.
  • The server 200 includes a processor 20, a memory 21, a storage 22, a communication IF 23, and an input/output IF 24. These components are electrically connected to each other via a communication bus.
  • The user terminal 100 may be a mobile terminal such as a smartphone, a feature phone, a PDA (Personal Digital Assistant), or a tablet computer.
  • The user terminal 100 may be a game device suitable for game play.
  • The user terminal 100 includes a processor 10, a memory 11, a storage 12, a communication interface (IF) 13, an input/output IF 14, a touch screen 15 (display unit), a camera 17, and a distance measuring sensor 18.
  • These components included in the user terminal 100 are electrically connected to each other via a communication bus.
  • The user terminal 100 may be configured to communicate with one or more controllers 1020.
  • The controller 1020 establishes communication with the user terminal 100 in accordance with a communication standard such as Bluetooth (registered trademark).
  • The controller 1020 may have one or more buttons and the like, and transmits output values based on the user's input operations on those buttons to the user terminal 100.
  • The controller 1020 may include various sensors such as an acceleration sensor and an angular velocity sensor, and transmits the output values of these sensors to the user terminal 100.
  • The controller 1020 may include the camera 17 and the distance measuring sensor 18 instead of, or in addition to, the user terminal 100 including them.
  • At the start of the game, for example, the user terminal 100 desirably allows a user who uses the controller 1020 to input user identification information, such as the user's name or login ID, via the controller 1020.
  • The user terminal 100 can thereby associate the controller 1020 with the user, and can identify which user a received output value belongs to based on its transmission source (controller 1020).
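The association between controllers and users described above amounts to a lookup keyed by the transmission source. The following Python sketch is illustrative only; the identifiers and data shapes are hypothetical, not from the patent.

```python
# Illustrative sketch: associate each controller with a user at game start,
# then resolve received output values by their transmission source.
# All identifiers are hypothetical.

controller_to_user = {}

def register(controller_id, user_name):
    # Called when the user inputs a name or login ID via the controller.
    controller_to_user[controller_id] = user_name

def resolve(controller_id, output_value):
    # Identify which user the output value belongs to from its source.
    user = controller_to_user.get(controller_id)
    return (user, output_value)

register("ctrl-1", "alice")
register("ctrl-2", "bob")
print(resolve("ctrl-2", {"button": "A"}))  # ('bob', {'button': 'A'})
```

With this mapping in place, one terminal can attribute every incoming output value to the correct player, which is what enables the local multiplay described next.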
  • When each of a plurality of users holds their own controller 1020, multiplay can be realized on the single user terminal 100 without communicating with other devices such as the server 200 via the network 2.
  • User terminals 100 can also be connected to each other by a wireless standard such as a wireless LAN (Local Area Network) standard (without a communication connection via the server 200), so that a plurality of user terminals 100 can realize multiplay locally.
  • The user terminal 100 may further include at least some of the various functions of the server 200 described later.
  • The plurality of user terminals 100 may be provided with the various functions of the server 200 described later in a distributed manner.
  • The user terminal 100 may communicate with the server 200 even when multiplay is realized locally as described above.
  • For example, information indicating a play result, such as a score or win/loss in a certain game, may be associated with user identification information and transmitted to the server 200.
  • The controller 1020 may be configured to be detachable from the user terminal 100.
  • In this case, a coupling portion for the controller 1020 may be provided on at least one surface of the housing of the user terminal 100.
  • The user terminal 100 may accept attachment of a storage medium 1030, such as an external memory card, via the input/output IF 14. The user terminal 100 can thereby read programs and data recorded on the storage medium 1030.
  • The program recorded on the storage medium 1030 is, for example, a game program.
  • The user terminal 100 may store a game program acquired by communicating with an external device such as the server 200 in the memory 11, or may store a game program acquired by reading it from the storage medium 1030 in the memory 11.
  • The user terminal 100 includes the communication IF 13, the input/output IF 14, the touch screen 15, the camera 17, and the distance measuring sensor 18 as examples of mechanisms for inputting information to the user terminal 100.
  • Each of the above units serving as an input mechanism can be regarded as an operation unit configured to accept the user's input operations.
  • When the operation unit is configured by at least one of the camera 17 and the distance measuring sensor 18, the operation unit detects an object 1010 in the vicinity of the user terminal 100 and identifies an input operation from the detection result.
  • For example, the user's hand, a marker having a predetermined shape, or the like is detected as the object 1010, and an input operation is identified based on the color, shape, movement, or type of the object 1010 obtained as the detection result.
  • More specifically, when a user's hand is detected from an image captured by the camera 17, the user terminal 100 identifies and accepts a gesture (a series of movements of the user's hand) detected from the captured image as the user's input operation. The captured image may be a still image or a moving image.
  • When the operation unit is configured by the touch screen 15, the user terminal 100 identifies and accepts an operation performed on the input unit 151 of the touch screen 15 as the user's input operation. When the operation unit is configured by the communication IF 13, the user terminal 100 identifies and accepts a signal (for example, an output value) transmitted from the controller 1020 as the user's input operation.
  • When the operation unit is configured by the input/output IF 14, a signal output from an input device (not shown) other than the controller 1020 connected to the input/output IF 14 is identified and accepted as the user's input operation.
  • A game based on the game system 1 is a game in which a character operated by the user executes an action in response to the user's operation.
  • Hereinafter, the character operated by the user is referred to as an operation character.
  • The game based on the game system 1 is a game that causes the operation character to execute continuous actions in response to the user's continuous operations on the touch screen 15.
  • An example of such a game is an action game in which combo actions are executed when the operation character battles an enemy character.
  • The game based on the game system 1 may be an RPG (Role-Playing Game).
  • Such an RPG may be, for example, an MMORPG (Massively Multiplayer Online Role-Playing Game) in which a plurality of users simultaneously participate in one game space via their own user terminals.
  • Such an MMORPG may be, for example, an open world game in which the operation character can move freely in a virtual game space.
  • The game based on the game system 1 is not limited to these illustrated types of games.
  • The processor 10 controls the operation of the entire user terminal 100.
  • The processor 20 controls the operation of the entire server 200.
  • The processors 10 and 20 include a central processing unit (CPU), a micro processing unit (MPU), and a graphics processing unit (GPU).
  • The processor 10 reads a program from the storage 12 described later and loads it into the memory 11 described later.
  • The processor 20 reads a program from the storage 22 described later and loads it into the memory 21 described later.
  • The processor 10 and the processor 20 execute the loaded programs.
  • The memories 11 and 21 are main storage devices.
  • The memories 11 and 21 are configured by storage devices such as a ROM (Read Only Memory) and a RAM (Random Access Memory).
  • The memory 11 provides a work area to the processor 10 by temporarily storing programs and various data that the processor 10 reads from the storage 12 described later.
  • The memory 11 also temporarily stores various data generated while the processor 10 is operating according to a program.
  • The memory 21 provides a work area to the processor 20 by temporarily storing various programs and data that the processor 20 reads from the storage 22 described later.
  • The memory 21 also temporarily stores various data generated while the processor 20 is operating according to a program.
  • The program may be a game program for realizing the game on the user terminal 100.
  • The program may be a game program for realizing the game through cooperation between the user terminal 100 and the server 200.
  • The program may be a game program for realizing the game through cooperation among a plurality of user terminals 100.
  • The various data include data related to the game, such as user information and game information, and instructions and notifications transmitted and received between the user terminal 100 and the server 200 or among the plurality of user terminals 100.
  • The storages 12 and 22 are auxiliary storage devices.
  • The storages 12 and 22 are configured by storage devices such as a flash memory or an HDD (Hard Disk Drive).
  • Various data relating to the game are stored in the storages 12 and 22.
  • The communication IF 13 controls the transmission and reception of various data in the user terminal 100.
  • The communication IF 23 controls the transmission and reception of various data in the server 200.
  • The communication IFs 13 and 23 control, for example, communication via a wireless LAN (Local Area Network), Internet communication via a wired LAN, a wireless LAN, or a mobile phone network, and short-range wireless communication.
  • The input/output IF 14 is an interface through which the user terminal 100 accepts data input and outputs data.
  • The input/output IF 14 may input and output data via a USB (Universal Serial Bus) or the like.
  • The input/output IF 14 may include, for example, physical buttons of the user terminal 100, a camera, a microphone, and a speaker.
  • The input/output IF 24 of the server 200 is an interface through which the server 200 accepts data input and outputs data.
  • The input/output IF 24 may include, for example, an input unit such as a mouse or a keyboard, and a display unit that displays and outputs images.
  • The touch screen 15 of the user terminal 100 is an electronic component in which an input unit 151 and a display unit 152 are combined.
  • The input unit 151 is a touch-sensitive device, configured by, for example, a touch pad.
  • The display unit 152 is configured by, for example, a liquid crystal display or an organic EL (Electro-Luminescence) display.
  • The input unit 151 has a function of detecting the position at which a user operation (a physical contact operation such as a touch, slide, swipe, or tap operation) is input on the input surface, and transmitting information indicating that position as an input signal.
  • The input unit 151 may include a touch sensing unit (not shown).
  • The touch sensing unit may adopt any method, such as a capacitance method or a resistive film method.
  • The user terminal 100 may include one or more sensors for identifying the holding posture of the user terminal 100.
  • These sensors may be, for example, an acceleration sensor or an angular velocity sensor.
  • The processor 10 can identify the holding posture of the user terminal 100 from the sensor output and perform processing according to that posture.
  • For example, the processor 10 may perform a portrait display in which a vertically long image is displayed on the display unit 152, or a landscape display in which a horizontally long image is displayed.
  • The processor 10 may be able to switch between the portrait display and the landscape display according to the holding posture of the user terminal 100.
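One common way to make such a posture-based switch, sketched below purely for illustration (the patent does not specify the method), is to compare the gravity components reported by the acceleration sensor along the device's axes. The axis convention and values here are hypothetical.

```python
# Illustrative sketch: choose portrait vs. landscape display from the
# gravity vector reported by an acceleration sensor. The axis convention
# is hypothetical, not from the patent.

def display_mode(ax, ay):
    """ax, ay: gravity components along the device's short and long axes."""
    # Gravity dominates the long (y) axis when the terminal is held upright.
    return "portrait" if abs(ay) >= abs(ax) else "landscape"

print(display_mode(0.1, 9.8))   # portrait  (held upright)
print(display_mode(9.8, 0.1))   # landscape (held sideways)
```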
  • The camera 17 includes an image sensor and the like, and generates a captured image by converting light incident from the lens into an electric signal.
  • The distance measuring sensor 18 is a sensor that measures the distance to an object to be measured.
  • The distance measuring sensor 18 includes, for example, a light source that emits pulsed light and a light receiving element that receives the light.
  • The distance measuring sensor 18 measures the distance to the object based on the timing of light emission from the light source and the timing of reception of the reflected light produced when the emitted light is reflected by the object.
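The timing-based measurement described above is the standard time-of-flight relation: the distance is the speed of light times the round-trip time, halved. The following sketch is illustrative only; the sensor's actual calibration is not described in the patent.

```python
# Illustrative sketch: distance from pulse emission/reception timing
# (time-of-flight). d = c * (t_receive - t_emit) / 2.

C = 299_792_458.0  # speed of light in m/s

def distance_m(t_emit_s, t_receive_s):
    # The light travels to the object and back, so halve the round trip.
    return C * (t_receive_s - t_emit_s) / 2.0

# A 2 ns round trip corresponds to roughly 0.3 m:
print(round(distance_m(0.0, 2e-9), 3))
```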
  • The distance measuring sensor 18 may include a light source that emits light having directivity.
  • The camera 17 and the distance measuring sensor 18 may be provided, for example, on a side surface of the housing of the user terminal 100.
  • The distance measuring sensor 18 may be provided in the vicinity of the camera 17.
  • As the camera 17, for example, an infrared camera can be used.
  • In this case, the camera 17 may be provided with an illumination device that emits infrared light, a filter that blocks visible light, and the like. This makes it possible to further improve the accuracy of detecting an object from the captured image of the camera 17, whether outdoors or indoors.
  • The processor 10 may perform, for example, one or more of the following processes (1) to (5) on the image captured by the camera 17.
  • (1) The processor 10 performs image recognition processing on the captured image of the camera 17 to identify whether the captured image includes the user's hand.
  • The processor 10 may use a technique such as pattern matching as the analysis technique employed in the image recognition processing.
  • (2) The processor 10 detects the user's gesture from the shape of the user's hand. For example, the processor 10 identifies the number of the user's fingers (the number of extended fingers) from the shape of the hand detected in the captured image.
  • The processor 10 further identifies the gesture performed by the user from the identified number of fingers.
  • For example, when five fingers are detected, the processor 10 determines that the user has made a "paper" gesture. When the number of fingers is zero (no finger is detected), the processor 10 determines that the user has made a "rock" gesture. When the number of fingers is two, the processor 10 determines that the user has made a "scissors" gesture. (3) The processor 10 performs image recognition processing on the image captured by the camera 17 to detect whether only the user's index finger is raised and whether the user's finger has moved.
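The finger-count rule described above maps directly to a small classifier. The sketch below is illustrative only; "goo", "choki", and "par" are the Japanese rock-paper-scissors gesture names rendered as rock, scissors, and paper.

```python
# Illustrative sketch of the finger-count rule described above:
# 0 fingers -> rock ("goo"), 2 -> scissors ("choki"), 5 -> paper ("par").

def classify_gesture(finger_count):
    if finger_count == 0:
        return "rock"      # "goo": hand clenched, no fingers detected
    if finger_count == 2:
        return "scissors"  # "choki": two fingers extended
    if finger_count == 5:
        return "paper"     # "par": all fingers extended
    return "unknown"       # other counts are not named in the patent

print(classify_gesture(0))  # rock
print(classify_gesture(2))  # scissors
print(classify_gesture(5))  # paper
```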
  • (4) The processor 10 detects the distance between an object 1010 (such as the user's hand) in the vicinity of the user terminal 100 and the user terminal 100, based on at least one of the image recognition result for the image captured by the camera 17 and the output value of the distance measuring sensor 18. For example, the processor 10 detects whether the user's hand is near (for example, at a distance less than a predetermined value) or far (for example, at a distance greater than or equal to the predetermined value) from the size of the hand shape identified in the captured image. When the captured image is a moving image, the processor 10 may detect whether the user's hand is approaching or moving away from the user terminal 100.
  • (5) The processor 10 recognizes, for example, that the user is waving a hand in the shooting direction of the camera 17.
  • The processor 10 also recognizes that the user is waving a hand in a direction perpendicular to the shooting direction of the camera 17.
  • In this way, the processor 10 detects, by image recognition on the captured image of the camera 17, whether the user's hand is clenched (a "rock" gesture) or open (another gesture, for example "paper").
  • The processor 10 also detects how the user is moving the hand, along with the shape of the user's hand.
  • The processor 10 also detects whether the user's hand is approaching or moving away from the user terminal 100.
  • Such operations can correspond to operations using a pointing device such as a mouse or a touch panel.
  • For example, the user terminal 100 moves a pointer on the touch screen 15 according to the movement of the user's hand, and when it detects the user's "rock" gesture, recognizes that the user is continuing a selection operation.
  • Continuing the selection operation corresponds to, for example, keeping a mouse button clicked and held, or maintaining contact after a touchdown operation on a touch panel. Furthermore, when the user moves the hand while the "rock" gesture is detected, the user terminal 100 can recognize such a series of gestures as an operation corresponding to a swipe operation (or drag operation). In addition, when the user terminal 100 detects, from the detection result of the user's hand based on the captured image of the camera 17, a gesture in which the user flicks a finger, it may recognize the gesture as an operation corresponding to a mouse click or a tap on a touch panel.
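The correspondence between hand gestures and pointing-device operations can be sketched as a small state machine: a clenched hand ("goo"/rock) starts and holds a press, and opening the hand releases it, with hand movement in between acting as a drag. The event names and data shapes below are hypothetical, for illustration only.

```python
# Illustrative sketch: translate a stream of (hand_position, gesture)
# samples into pointer events, mirroring the correspondence described
# above. Event names are hypothetical, not from the patent.

def pointer_events(samples):
    events = []
    pressed = False
    for pos, gesture in samples:
        events.append(("move", pos))         # pointer follows the hand
        if gesture == "goo" and not pressed:
            events.append(("press", pos))    # like mouse-down / touchdown
            pressed = True
        elif gesture != "goo" and pressed:
            events.append(("release", pos))  # like mouse-up / touch release
            pressed = False
    return events

# Open hand, clench, move while clenched (drag), then open to release:
evs = pointer_events([((0, 0), "par"), ((1, 0), "goo"),
                      ((2, 0), "goo"), ((2, 1), "par")])
print(evs)
```

Moving the hand while "goo" is held produces move events between press and release, which is exactly the swipe/drag correspondence described above.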
  • FIG. 2 is a diagram illustrating a functional configuration of the user terminal 100.
  • The user terminal 100 can function as a control unit 110 and a storage unit 120 through the cooperation of the processor 10, the memory 11, the storage 12, the communication IF 13, the input/output IF 14, and the like.
  • The game program stored in the storage unit 120 is loaded into the main memory and executed by the control unit 110.
  • The control unit 110 can function as an input operation receiving unit 111, a camera arrangement control unit 112, a display control unit 113, and an object control unit 114 according to the game program.
  • The memory also temporarily stores various game data generated while the control unit 110 is operating according to the program and various game data used by the control unit 110.
  • The storage unit 120 stores data necessary for the control unit 110 to function as each of these units. Examples of such data include the game program, game information, and user information.
  • The game information includes an object management table, a skill management table, reference motion data 121, combo data 122, a history information table (described later), and a user management table.
  • The object management table is a table for managing various game objects.
  • The skill management table is a table for managing the various skills of game characters.
  • The reference motion data 121 is a table defining the motions of each game character.
  • The combo data 122 is a table for managing the contents (for example, attack power and motion) of attacking actions performed by a character through combos generated under predetermined conditions.
  • The history information table is a table containing a plurality of pieces of information indicating contact positions detected by the input unit 151.
  • The user management table is a table containing a history of the actions executed by the game character.
  • The control unit 110 controls the operation of the entire user terminal 100, transmitting and receiving data between elements and performing the arithmetic processing necessary for executing the game, among other processing. For example, the control unit 110 advances the game according to the game program based on input operations detected by the input operation receiving unit 111, and draws a game image showing the result.
  • The control unit 110 operates objects in the game space based on user information received from the game server 200, the calculation results of the game program, and input operations detected by the input operation receiving unit 111. Examples of such objects include character objects (game characters) and target objects (other objects).
  • A character object is an object operated by the user in the game.
  • A target object is an object on which a character object acts.
  • The control unit 110 generates a captured image of a virtual camera that captures the game space. The control unit 110 also performs processing such as updating the various data stored in the storage unit 120 based on input operations on the touch screen 15 and the results of arithmetic processing. In addition, the control unit 110 refers to the various user information and game information stored in the storage unit 120 and executes the various determinations necessary for the progress of the game.
  • the input operation reception unit 111 detects the type of user input operation on the input unit 151.
  • the input operation reception unit 111 determines what operation has been performed from an operation instruction or the like by the console via the input unit 151 and other input / output IFs 14.
  • the input operation reception unit 111 outputs the determined result to necessary elements such as the camera arrangement control unit 112. Examples of the types of input operations that are determined by the input operation receiving unit 111 include a first input operation and a second input operation.
  • the first input operation is an operation input by moving the touch position on the touch screen 15 from a certain position (first position) to a certain position (second position). Examples of the first input operation include a flick operation and a swipe operation.
  • The input operation receiving unit 111 treats as a "flick operation" an operation input by moving the touch position from the first position to another position on the touch screen 15 within a time shorter than a predetermined time and then releasing the contact.
  • The input operation reception unit 111 may treat as a "swipe operation" an operation (third input operation) input by moving the touch position on the touch screen 15 from a certain position (third position) to a certain position (fourth position) and then maintaining the contact.
  • the second input operation is an operation that is input without changing the touch position on the touch screen 15.
  • An example of the second input operation is a tap operation.
  • The input operation accepting unit 111 determines that the state is the "touch-on state" when contact of an object with the touch screen 15 is detected from a state in which no contact was detected. In addition, the input operation reception unit 111 determines that the "touch-off state" has occurred when contact of the object with the touch screen 15 is no longer detected. Further, while contact continues, the input operation accepting unit 111 sequentially records the touch positions on the touch screen 15 as history information of the "touch-now state".
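The state transitions described above can be sketched as a small helper. The three state labels follow the text; the "idle" label for the case where no contact exists at all is an assumption added for completeness:

```python
def touch_state(prev_contact, contact):
    """Return the state label for one sampling step.

    prev_contact / contact are booleans: whether an object was / is
    touching the touch screen 15 at the previous / current sample.
    """
    if not prev_contact and contact:
        return "touch-on"    # contact newly detected
    if prev_contact and not contact:
        return "touch-off"   # contact no longer detected
    if contact:
        return "touch-now"   # contact continuing; record history information
    return "idle"            # assumed label: no contact at either sample
```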
  • FIG. 3 is a diagram illustrating an example of a history information table referred to by the input operation receiving unit 111 in order to detect the type of input operation.
  • history information indicating the position on the touch screen 15 detected by the input unit 151 is stored in each of the 11 arrays from array fp [0] to array fp [10].
  • The history information is stored in the history information table every predetermined period (for example, once per frame).
  • the number of arrays in which history information is stored is not limited and may be any number.
  • It is preferable that the history information detected at the transition from touch-off to touch-on be stored in the storage unit 120 as initial position coordinates.
  • When the contact ends without the touch position changing, the input operation accepting unit 111 determines that the input operation is a tap operation. For example, when a null value is stored after the history information has changed in the touch-now state, the input operation reception unit 111 refers to the history information stored in the arrays fp[3] and fp[4], which immediately precede the array fp[5] in which the null value is stored.
  • The input operation reception unit 111 determines that the input operation is a flick operation when the distance between the positions indicated by the history information of the arrays fp[3] and fp[4] is greater than or equal to a preset threshold value. Further, when, after the history information has changed in the touch-now state, the same history information (x15, y15) is stored in the arrays fp[4] to fp[10], the input operation accepting unit 111 determines that the input operation is a swipe operation.
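The classification above can be sketched against the 11-entry history table of FIG. 3. The threshold value and the "release" label for a slow release are assumptions; `None` stands in for the null value stored at touch-off:

```python
import math

FLICK_THRESHOLD = 24.0  # pixels; an assumed value for the preset threshold

def classify_input(fp):
    """Classify an input from the history table fp[0] .. fp[10].

    fp holds one (x, y) touch position per frame; None marks the
    null value stored once contact with the touch screen 15 ends.
    """
    if None in fp:
        end = fp.index(None)          # first entry after contact ended
        touched = fp[:end]
        # Tap: the touch position never changed before release.
        if len(set(touched)) <= 1:
            return "tap"
        # Otherwise compare the two entries just before the null value
        # (e.g. fp[3] and fp[4] when the null value is in fp[5]).
        (x1, y1), (x2, y2) = touched[-2], touched[-1]
        if math.hypot(x2 - x1, y2 - y1) >= FLICK_THRESHOLD:
            return "flick"
        return "release"              # assumed label for a slow release
    # Contact continues: the position moved and is now held still.
    if len(set(fp)) > 1 and len(set(fp[4:])) == 1:
        return "swipe"
    return "touch-now"
```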
  • the camera arrangement control unit 112 controls the arrangement of virtual cameras that shoot the game space.
  • The camera arrangement control unit 112 controls the arrangement of the virtual camera that designates the field of view of the game space, based on the user information stored in the storage unit 120, the calculation result of the game program, and the type of operation detected by the input operation receiving unit 111.
  • The camera arrangement control unit 112 preferably arranges the virtual camera so that the character object is displayed near the center of the screen.
  • the camera arrangement control unit 112 outputs the captured image of the virtual camera to the display control unit 113.
  • the display control unit 113 displays the captured image of the virtual camera on the display unit 152. Further, the display control unit 113 causes the display unit 152 to display the object in accordance with an instruction from the object control unit 114.
  • the object control unit 114 controls objects such as a character object and a target object based on an input operation by a user and / or a game program. For example, the object control unit 114 performs control for moving the character object, control for causing the character object to perform an action for acting on the target object, and the like. Further, the object control unit 114 instructs the display control unit 113 to display an object according to the input operation received by the input operation receiving unit 111 on the display unit 152.
  • FIG. 4 is an example of a flowchart showing a flow of processing executed by the game system 1.
  • FIG. 5 is a diagram schematically illustrating an example of a game space image displayed on the touch screen 15 by the display control unit 113.
  • Examples of the action to be executed by the character C include an action for acting on a target object, an action for moving, and the like.
  • Examples of the action for acting on the target object include an attack action.
  • Examples of the target object include an enemy character object that performs an attack action on the character C, and an obstacle object such as the door O shown in FIG.
  • The movement is an action for changing the position of the character C in the game space. Examples of the action for moving include walking, running, and an avoidance action for avoiding an attack action by another object or the other object itself.
  • In step S101, when the input operation receiving unit 111 receives an operation to start the game, the control unit 110 starts the game.
  • the camera arrangement control unit 112 controls the arrangement of virtual cameras in the game space with reference to game information and the like.
  • In step S102, the display control unit 113 displays the game space image captured by the virtual camera on the touch screen 15.
  • In step S103, the input operation reception unit 111 determines whether an input operation on the touch screen 15 has been received. If NO in step S103, the input operation accepting unit 111 repeats the process of step S103.
  • In step S104, the input operation accepting unit 111 determines whether or not the input operation accepted in step S103 is the first input operation.
  • For example, the first position is the position L1, and the second position is the position L2.
  • If YES in step S104, the display control unit 113 displays an object indicating the direction of the touch operation on the touch screen 15 in step S105.
  • the display control unit 113 may be configured to display an object or the like extending from the position L1 to the position L2 on the touch screen 15, as shown in FIG.
  • the display control unit 113 displays the elastic object E1 centered on the position L1 on the touch screen 15.
  • The display control unit 113 deforms the elastic object E1 so as to extend in the direction of the touch operation, and displays the resulting elastic object E2 on the touch screen 15.
  • the object control unit 114 specifies the direction in which the character C is facing.
  • the storage unit 120 stores information indicating the direction in which the face of the character C is facing or the traveling direction of the last moving operation.
  • the object control unit 114 selects which of the directions indicated by the information is the direction in which the character C is facing based on the action performed by the character C, the action from another object, and the like.
  • the object control unit 114 refers to the storage unit 120 based on the selected result and identifies the direction in which the character C is facing.
  • In step S107, the object control unit 114 compares the direction of the touch operation with the direction in which the character C is facing on the screen displayed on the touch screen 15.
  • In step S108, the object control unit 114 causes the character C to perform an action corresponding to the comparison result of step S107. For example, when the direction of the touch operation is included in a certain range of directions determined by the direction in which the character C is facing, the object control unit 114 causes the character C to perform an attack action.
  • The certain range of directions is not particularly limited; for example, the direction in which the character C is facing may be set as an axis, and the range may extend a certain amount to the left and right of that axis (for example, within 30 degrees or 45 degrees on each side).
  • In step S108, if the direction of the touch operation is not included in the certain range, the object control unit 114 causes the character C to perform an avoidance action of moving in the direction of the touch operation, without executing an attack action. After step S108, the process returns to step S103.
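The comparison in steps S107 and S108 can be sketched as an angle test. The 45-degree half-angle is one of the example values from the text; the signed-difference normalization is an implementation detail added here:

```python
ATTACK_HALF_ANGLE = 45.0  # degrees; one of the example values in the text

def choose_action(facing_deg, touch_deg, half_angle=ATTACK_HALF_ANGLE):
    """Attack if the touch direction lies within +/- half_angle of the
    direction the character C is facing; otherwise avoid by moving in
    the touch direction (steps S107 to S108)."""
    # Normalize the signed angular difference into (-180, 180].
    diff = (touch_deg - facing_deg + 180.0) % 360.0 - 180.0
    return "attack" if abs(diff) <= half_angle else "avoid"
```

Normalizing the difference first means the test also behaves correctly across the 0/360-degree wrap-around, e.g. a character facing 350 degrees attacked toward 10 degrees.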
  • If NO in step S104, the object control unit 114 causes the character C to execute an action corresponding to the input operation in step S109. For example, when the input operation is a tap operation, the object control unit 114 causes the character C to perform an attack action.
  • The attack action according to the tap operation is preferably different from the attack action executed by the character C in step S108. With this configuration, the character C can be made to perform various attack actions through various operations, so the appeal of the game improves.
  • After step S109, the process returns to step S103.
  • As described above, the object control unit 114 can assign different actions, such as causing the character C to perform an attack action or an avoidance action, depending on the direction in which the character C is facing and the direction of the touch operation, so the appeal of the game improves. In particular, the input operations directed at an attack target can be given diversity.
  • the display control unit 113 can visualize the direction of the touch operation by the process of step S105. Thereby, when the user wants the character C to perform an attacking action, the user can visually recognize both the direction in which the character C is facing and the direction of the touch operation. Therefore, the user can easily move the touch position in a desired direction while referring to the direction in which the character C is facing.
  • The flick operation is an operation that the user inputs intuitively and quickly in scenes that require quick operations during the game. Since the attack action or the avoidance action can be selected according to the direction of the flick operation, attack and avoidance can be chosen intuitively. Therefore, the appeal of the game improves.
  • Modification 1 Although the case where the first input operation is a flick operation has been described above, the same processing may be performed when the first input operation is a swipe operation. That is, when the input operation reception unit 111 receives a swipe operation in a direction included in a certain range of directions, the object control unit 114 may cause the character C to perform an attack action. Further, when the input operation accepting unit 111 accepts a swipe operation in a direction outside a certain range, the object control unit 114 may cause the character C to perform a moving action.
  • Modification 2: When the input operation receiving unit 111 receives the first input operation, the control unit 110 may change the action to be performed by the character C depending on whether the first input operation is a flick operation or a swipe operation.
  • When the operation received by the input operation receiving unit 111 is a flick operation, the control unit 110 executes steps S105 to S108. When the operation is a swipe operation, the control unit 110 executes the following process instead of steps S105 to S108.
  • The display control unit 113 displays, on the touch screen 15, an object indicating the direction of the touch operation determined by the third position and the fourth position of the swipe operation. For example, when the third position and the fourth position are the position L1 and the position L2 shown in FIG. 5, respectively, the display control unit 113 displays an elastic object similar to the elastic object E2 shown in FIG.
  • The object control unit 114 executes an action for moving the character C in the direction of the touch operation. For example, when the direction of the touch operation is the right direction, the object control unit 114 causes the character C to perform an action for moving in the right direction. Since a swipe operation keeps the user in contact with the touch screen 15 for a comparatively long time, the correspondence between the operation content and the character C's response is easier for the user to recognize intuitively than in an attack action or an avoidance action, in which the character C moves quickly. Therefore, the appeal of the game can be further improved.
  • Further, the object control unit 114 changes the direction in which the character C is facing to a direction determined based on the direction of the touch operation corresponding to the swipe operation. As a result, the traveling direction of the last moving operation becomes the direction in which the character C is facing. Users are considered to often recognize the traveling direction of the moving operation performed immediately before as the direction in which the character C is facing. Consequently, the direction in which the user recognizes the character C to be facing matches the direction set by the object control unit 114. Therefore, when the user performs an attack operation or another operation based on the direction in which the character C is facing after a moving action, erroneous operations can be made less likely to occur.
  • Since the display control unit 113 displays an object indicating the direction of the touch operation on the touch screen 15, the direction of the swipe operation can be visualized. This makes it easy to move the character C in a desired direction, so usability can be improved.
  • The action that the object control unit 114 causes the character C to execute upon receiving the current input operation may be associated with at least part of the previous input operation and the current input operation. That is, the object control unit 114 may cause the character C to perform a so-called combo attack action.
  • For example, the action that the object control unit 114 causes the character C to execute in response to a flick operation may be an action based on the fact that the previous operation was a tap operation and the current operation is a flick operation.
  • FIG. 6 is another example of a flowchart showing a flow of processing executed by the game system 1.
  • the processes in steps S101 to S107 and step S109 are the same as those described above, and thus description thereof will not be repeated.
  • After executing step S107, the object control unit 114 causes the character C to perform an action according to the comparison result. First, in step S201, the object control unit 114 determines whether the direction of the touch operation is included in the certain range of directions determined by the direction in which the character C is facing.
  • In step S202, the object control unit 114 determines whether or not a tap operation for performing an attack action was received within a certain period before the flick operation was received in step S103.
  • In step S203, the object control unit 114 causes the character C to execute an attack action based on the fact that the operation corresponding to the previous attack action was a tap operation and the current operation is a flick operation. In this way, when the character C is made to perform attack actions continuously, it can be made to execute a different attack action each time instead of the same one, so the attack actions become diverse. After step S203, the process returns to step S103.
  • If NO in step S202, the object control unit 114 causes the character C to perform an attack action based on the flick operation without referring to the combo data 122 in step S204. After step S204, the process returns to step S103.
  • In step S205, the object control unit 114 determines whether or not a tap operation was received within a certain period before the flick operation was received in step S103.
  • In step S206, the object control unit 114 causes the character C to execute an avoidance action of moving in the direction of the touch operation, without executing an attack action.
  • In step S207, the object control unit 114 causes the character C to perform an attack action.
  • The attack action that the object control unit 114 causes the character C to execute in step S207 may differ from the attack action based on the tap operation. In this case, the attack actions by the character C become more diverse, and the appeal of the game improves.
  • After step S207, the process returns to step S103.
  • In step S208, the object control unit 114 causes the character C to perform an avoidance action without performing an attack action. After step S208, the process returns to step S103.
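One plausible reading of the branch structure in steps S201 to S208 can be sketched as a decision table. The assignment of steps S207 and S208 to the out-of-range case, and the claim that step S203 is the branch that refers to the combo data 122, are assumptions drawn from the surrounding description:

```python
def decide_action(in_range, prev_tap_within_period):
    """Decide the character C's action for a flick received in step S103.

    in_range: the touch direction lies within the range determined by
      the facing direction of the character C (step S201).
    prev_tap_within_period: a tap operation was received within the
      certain period before the flick (steps S202 / S205).
    Returns (action, refers_to_combo_data_122).
    """
    if in_range:
        if prev_tap_within_period:
            return ("combo attack", True)   # step S203: combo data 122 is consulted
        return ("flick attack", False)      # step S204
    if prev_tap_within_period:
        return ("attack", False)            # step S207 (assumed branch)
    return ("avoidance", False)             # step S208 (assumed branch)
```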
  • In this way, the user can cause the character C to perform various actions according to the input operation performed before the first input operation and the direction of the first input operation, so the appeal of the game improves.
  • In step S109, the input operation receiving unit 111 may determine whether or not the input operation received in step S103 is a tap operation. Furthermore, when the input operation reception unit 111 determines that the received input operation is a tap operation, it may further determine whether or not another tap operation was received within a certain period before that tap operation. After step S109, the process returns to step S103.
  • When the input operation reception unit 111 receives a flick operation within a certain period and steps S202 and S203 are executed, a combo attack based on a series of input operations, such as a tap operation, a tap operation, and a flick operation, is established. An action performed by the character C in this combo attack will be described with reference to FIG. 7.
  • FIG. 7 is a diagram illustrating an example of an action when the object control unit 114 causes the character to perform an attack action based on the combo data 122.
  • The states (A), (B), and (C) of FIG. 7 show the actions a1, a2, and a3, respectively.
  • the object control unit 114 first causes the character C shown in the state (A) of FIG. 7 to execute the action a1. Subsequently, when the input operation accepting unit 111 accepts the tap operation again within a certain period, the object control unit 114 causes the character C to execute the action a2 shown in the state (B) of FIG.
  • Subsequently, when the input operation accepting unit 111 accepts a flick operation within a certain period, the object control unit 114 causes the character C to execute the action a3 shown in the state (C) of FIG. 7.
  • The action a3 may be set so that its presentation is flashier than that of the actions a1 and a2, or so that its attack power is larger.
  • When the tap operation is repeated three times, the object control unit 114 may likewise cause the character C to perform another action whose presentation is flashy or whose attack power is large. According to an embodiment of the present disclosure, even in a game in which the character C performs a combo attack by a touch operation using the touch screen 15, the flow of the combo can be branched.
  • control unit 110 returns to the process of step S103.
  • In this way, the object control unit 114 can cause the character C to execute a combo for the third and subsequent input operations as well.
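The combo data 122 can be sketched as a lookup from the recent operation sequence to the action the character C performs. The sequences and the action names a1 to a3 follow FIG. 7; the table layout and the reset rule when the certain period elapses are assumptions:

```python
# A minimal sketch of the combo data 122: the key is the recent
# sequence of operations, the value is the action to execute.
COMBO_DATA = {
    ("tap",): "a1",
    ("tap", "tap"): "a2",
    ("tap", "tap", "flick"): "a3",  # flashier presentation / larger attack power
}

def combo_action(history, within_period):
    """Return the action for the current input, given the operations
    received so far; the combo resets if the certain period elapsed."""
    if not within_period:
        history = history[-1:]  # only the current operation counts
    return COMBO_DATA.get(tuple(history[-3:]), None)
```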
  • The control unit 110 may execute the processes from step S103 onward in parallel during the action execution period after the character C starts an attack action and before that action ends. For example, when the input operation reception unit 111 receives a flick operation during at least part of the period in which the object control unit 114 executes step S203, the control unit 110 executes the processes from step S103 onward. The control unit 110 then refers to the combo data 122 and causes the storage unit 120 to store an action corresponding to the flick operation. After completing the process of step S203, the object control unit 114 reads the action from the storage unit 120 and causes the character C to execute it. Even with this configuration, the object control unit 114 can cause the character C to perform a combo.
  • When causing the character C to perform combos, the control unit 110 may display the number of combos on the touch screen 15.
  • the control unit 110 may count up the number of times that the character C continuously executes the action each time the action is performed, and display the count value as the number of combos on the touch screen 15.
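The combo counting just described can be sketched as a small counter. The class name and the boolean "continued" flag (whether the current action continues from the previous one) are assumptions:

```python
class ComboCounter:
    """Counts how many times the character C executes actions
    consecutively, so the count can be displayed on the touch
    screen 15 as the number of combos."""

    def __init__(self):
        self.count = 0

    def on_action(self, continued):
        """Count up when the action continues the chain; otherwise
        restart the count at 1 for the new action."""
        self.count = self.count + 1 if continued else 1
        return self.count
```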
  • FIG. 8 is a block diagram showing functional configurations of the server 200 and the user terminal 100 included in the game system 1.
  • the user terminal 100 has a function as an input device that receives user input operations and a function as an output device that outputs game images and sounds.
  • the user terminal 100 functions as the control unit 110 and the storage unit 120 through cooperation of the processor 10, the memory 11, the storage 12, the communication IF 13, the input / output IF 14, and the like.
  • The server 200 communicates with each user terminal 100 and has a function of supporting the user terminal 100 in advancing the game. For example, it executes sales of valuable data and provision of services.
  • the server 200 may have a function of communicating with each user terminal 100 participating in the game and mediating exchanges between the user terminals 100.
  • the server 200 functions as the control unit 210 and the storage unit 220 through the cooperation of the processor 20, the memory 21, the storage 22, the communication IF 23, the input / output IF 24, and the like.
  • the storage unit 120-1 and the storage unit 220 store a game program 131, game information 132, and user information 133.
  • the game program 131 is a game program that is executed by the user terminal 100 and the server 200.
  • the game information 132 is data referred to when the control unit 110-1 and the control unit 210 execute the game program 131.
  • The game information 132 is information common to a plurality of users, and may include, for example, (1) information defining the game space, (2) basic parameters for each character, (3) basic parameters for each action, and (4) initial information, recommended information, and the like to be presented on an editing screen described later.
  • the game space is a space where operation characters and various objects related to the game are arranged.
  • the game information 132 may include information related to various events performed in the game space.
  • User information 133 is data related to the user's account.
  • The user information 133 includes, in association with the identifier of the user's account, (1) information indicating the user of the account, (2) information regarding the retained characters held by the account, (3) information regarding the actions that each retained character has acquired, (4) information indicating the progress of the game on the account, (5) information regarding assets held by the account, and the like. Examples of the account's assets include virtual currency, items, and equipment in the game.
  • the user information 133 includes various information managed for each account.
  • user information 133 is stored for each user terminal 100.
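The per-account structure of the user information 133 can be sketched as a record keyed by account identifier. All field names here are assumptions chosen to mirror items (1) through (5) above:

```python
from dataclasses import dataclass, field

@dataclass
class UserInfo:
    """A sketch of the user information 133 for one account."""
    account_id: str                                           # account identifier
    user_name: str = ""                                       # (1) user of the account
    retained_characters: list = field(default_factory=list)   # (2) retained characters
    acquired_actions: dict = field(default_factory=dict)      # (3) actions per character
    progress: dict = field(default_factory=dict)              # (4) game progress
    assets: dict = field(default_factory=dict)                # (5) currency, items, equipment
```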
  • the control unit 210 performs overall control of the server 200 by executing the game program 131 stored in the storage unit 220. For example, the control unit 210 transmits various data and programs to the user terminal 100. The control unit 210 receives part or all of the game information or user information from the user terminal 100. When the game is a multiplayer game, the control unit 210 may receive a multiplayer synchronization request from the user terminal 100 and transmit data for synchronization to the user terminal 100. In addition, the control unit 210 has various functions according to the nature of the game to be executed in order to support the progress of the game on the user terminal 100.
  • the control unit 110-1 performs overall control of the user terminal 100 by executing the game program 131 stored in the storage unit 120-1. For example, the control unit 110-1 advances the game according to the game program 131 and the user's operation. In addition, the control unit 110-1 communicates with the server 200 and transmits / receives information as necessary while the game is in progress.
  • the control unit 110-1 includes an operation reception unit 111-1, a display control unit 112-1, a user interface (hereinafter referred to as UI) control unit 113-1, an animation generation unit 114-1, It functions as a game execution unit 115, a camera arrangement control unit 116, an edit screen generation unit 117, and a continuous operation determination unit 118.
  • the control unit 110-1 can also function as other functional blocks (not shown) in order to advance the game according to the nature of the game to be executed.
  • the operation accepting unit 111-1 detects and accepts a user input operation on the input unit 151.
  • The operation accepting unit 111-1 determines what input operation has been performed from the action exerted by the user on the console via the touch screen 15 and the other input / output IF 14, and outputs the result to each element of the control unit 110-1.
  • the operation receiving unit 111-1 receives an input operation on the input unit 151, detects the coordinates of the input position of the input operation, and identifies the type of the input operation.
  • the input operation is an operation on the touch screen 15.
  • the operation reception unit 111-1 identifies, for example, a touch operation, a slide operation, a swipe operation, a tap operation, and the like as the types of operations on the touch screen 15. Further, the operation receiving unit 111-1 detects that the touch input is canceled from the touch screen 15 when the input that has been continuously detected is interrupted. In the present embodiment, details of the types of operations specified by the operation reception unit 111-1 will be described later.
  • the UI control unit 113-1 controls UI objects to be displayed on the display unit 152 in order to construct a UI.
  • the UI object is a tool for the user to make an input necessary for the progress of the game to the user terminal 100 or a tool for obtaining information output during the progress of the game from the user terminal 100.
  • the UI object is not limited to this, but is, for example, an icon, a button, a list, a menu screen, or the like.
  • the game execution unit 115 advances the game by causing an operation character existing in the game space to execute an action based on the user's operation.
  • The game progresses as the operation character battles the enemy character. The operation character executes actions in the battle to reduce the enemy character's physical strength, and wins the battle when the enemy character's physical strength is exhausted.
  • the content of the action to be executed by the operation character is determined by the continuous operation determination unit 118 described later. Examples of the action type include, but are not limited to, an action that acts on an enemy character, a movement of the operation character itself, a summoning of other possessed characters to the game space, and the like.
  • examples of the action that acts on the enemy character include, but are not limited to, an attack on the enemy character and an action that changes the attribute of the enemy character.
  • Examples of the attack on the enemy character include a technique of blowing the enemy character away a predetermined distance and a technique that affects the enemy character's movement for a predetermined period of time, but are not limited thereto.
  • examples of the attribute of the enemy character to be changed include, but are not limited to, attack power, defense power, accuracy, moving speed, and the like.
  • the game execution unit 115 notifies the determined action to the animation generation unit 114-1, which will be described later.
  • the game execution unit 115 notifies the later-described camera arrangement control unit 116 of the position or direction when there is a change in the position or direction of the operation character in the game space.
  • the game execution unit 115 executes various processes for advancing the game while communicating with the server 200 as necessary.
  • the camera arrangement control unit 116 defines a virtual camera for designating an area to be presented to the user in the game space.
  • the camera placement control unit 116 virtually places the virtual camera in the game space by defining the position and orientation of the virtual camera in the game space.
  • the camera arrangement control unit 116 instructs the display control unit 112-1 described later to create an image in which a field of view defined by the virtual camera and an object arranged in the field of view are drawn.
  • the camera arrangement control unit 116 may arrange the virtual camera behind the operation character based on the position of the operation character notified from the game execution unit 115.
  • the camera arrangement control unit 116 causes the orientation of the virtual camera to be the direction in which the operation character notified from the game execution unit 115 is facing.
  • the control of the placement and orientation of the virtual camera is not necessarily the control described above.
  • the edit screen generation unit 117 generates an edit screen for associating actions corresponding to the order in the operation sequence of the operations with respect to each operation included in the operation sequence including the continuous operations.
  • one operation sequence is a sequence of continuous operations of the same type.
  • information in which an action corresponding to an order is associated with each operation included in the operation sequence is also referred to as operation sequence information.
  • associating an action corresponding to an order with each operation is also referred to as editing operation sequence information. Details of the edit screen generation unit 117 will be described later.
  • The continuous operation determination unit 118 determines whether or not the operation on the touch screen 15 continues from the previous operation. In addition, when the continuous operation determination unit 118 determines that the current operation continues from the previous operation, it determines the action to be executed by the operation character according to the current operation by referring to the operation sequence information. Details of the continuous operation determination unit 118 will be described later.
  • The animation generation unit 114-1 generates animations indicating the motions of various objects based on the control modes of those objects. For example, the animation generation unit 114-1 generates an animation expressing how the operation character performs an action. If the action is an attack on an enemy character, the animation generation unit 114-1 generates an animation representing the preparatory motion of the operation character before the attack, an animation representing the attack motion, and an animation representing the return motion after the attack. In addition, for example, the animation generation unit 114-1 generates an animation representing a state in which the enemy character receives the action of the operation character (a hit reaction motion).
  • the display control unit 112-1 outputs a game screen reflecting the processing results of the above-described elements to the display unit 152 of the touch screen 15. For example, the display control unit 112-1 generates a game screen in which the objects existing within the field-of-view region of the virtual camera defined by the camera placement control unit 116 are drawn in the game space. Further, the display control unit 112-1 outputs a game screen including the animation generated by the animation generation unit 114-1. In addition, the display control unit 112-1 may draw the UI objects described above superimposed on the game screen.
  • the server 200 may include at least a part of functions included in the user terminal 100.
  • the user terminal 100 may include at least a part of the functions included in the server 200.
  • a device other than the user terminal 100 and the server 200 may be used as a component of the game system 1, and the other device may be caused to execute part of the processing in the game system 1. That is, the computer that executes the game program in the present embodiment may be any of the user terminal 100, the server 200, and other devices, or may be realized by a combination of these devices.
  • Details of the types of operations specified by the operation reception unit 111-1 will be described. In the present embodiment, it is assumed that the operation types include a tap operation, an upper flick operation, a lower flick operation, a left/right flick operation, and a rotation operation. A known method can be applied to detecting the tap operation.
  • the up, down, left and right directions can be arbitrarily determined on the display unit 152.
  • the direction in which the character is facing in the game space displayed on the display unit 152 may be the upward direction.
  • the opposite direction of the upward direction can be defined as the downward direction.
  • a direction obtained by rotating the upward direction 90 degrees to the left (or right) can be determined as the left (or right) direction.
  • the operation receiving unit 111-1 detects the flick operation as an upper flick operation when the direction of the flick operation is included in a range of a predetermined angle with respect to the upper direction. Thereby, a flick operation in the direction in which the operation character is facing is detected as an upper flick operation. Further, the operation reception unit 111-1 detects a flick operation as a lower flick operation when the direction of the flick operation is included in a range of a predetermined angle with respect to the lower direction. Thereby, the flick operation toward the back of the operation character is detected as the lower flick operation. Further, the operation reception unit 111-1 detects a flick operation as a left / right flick operation when the direction of the flick operation is included in a predetermined angle range with respect to the left (or right) direction. As a result, the flick operation toward the horizontal direction of the operation character is detected as a left / right flick operation.
  • the rotation operation is an operation in which the locus of the contact position (that is, the touch position) of the object 1010 in the drag operation becomes a ring shape or a substantially ring shape.
  • the ring drawn by the locus of the rotation operation does not have to be closed.
  • the start position and the end position of the drag operation in the rotation operation do not have to match.
  • the rotation operation can be specified by detecting a change in the vector component of the drag operation during a predetermined period. Specifically, if the drag operation continues for a predetermined period after detecting the contact of the object 1010 with the input unit 151, the operation receiving unit 111-1 detects a change in the vector component in the operation direction. The change in the vector component in the operation direction will be described with reference to FIG. FIG. 9 is a schematic diagram illustrating a change in the vector component in the operation direction in the rotation operation.
  • the operation direction is represented by a set of an x component along the x axis and a y component along the y axis when an arbitrary orthogonal coordinate system is defined on the plane of the touch screen 15.
  • the positive x component is described as “x axis positive direction”.
  • the negative x component is described as “x-axis negative direction”.
  • the positive y component is referred to as “y-axis positive direction”.
  • the negative y component is described as “y-axis negative direction”.
  • for example, suppose the vector component in the operation direction changes in the order (1) (x-axis negative direction, y-axis positive direction), (2) (x-axis positive direction, y-axis positive direction), (3) (x-axis positive direction, y-axis negative direction), (4) (x-axis negative direction, y-axis negative direction).
  • in this case, the operation reception unit 111-1 determines that a rotation operation has been detected.
  • the rotation operation in this case is a clockwise operation.
  • also, for example, as shown in FIG. 9, suppose the vector component in the operation direction changes in the order (1) (x-axis positive direction, y-axis positive direction), (2) (x-axis negative direction, y-axis positive direction), (3) (x-axis negative direction, y-axis negative direction), (4) (x-axis positive direction, y-axis negative direction).
  • in this case as well, the operation reception unit 111-1 determines that a rotation operation has been detected.
  • the rotation operation in this case is a counterclockwise operation.
  • the operation accepting unit 111-1 may store a vector component change pattern representing a rotation operation in advance and determine that the rotation operation has been detected when a corresponding pattern is detected. Note that the corresponding change pattern is not limited to the above-described example. Further, the detection method of the rotation operation is not limited to the above-described method, and other methods can be applied.
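The quadrant-sequence detection described above can be sketched as follows. This is a minimal illustration, not the actual implementation; the function names, the collapsing of repeated quadrants, and the treatment of zero components are assumptions.

```python
# Sketch of rotation-operation detection: a drag path is reduced to the
# sequence of sign quadrants of its direction vectors, and compared against
# a stored clockwise / counterclockwise change pattern (allowing the ring
# to start at any quadrant). Names are illustrative, not from the source.

CLOCKWISE = [(-1, 1), (1, 1), (1, -1), (-1, -1)]
COUNTERCLOCKWISE = [(1, 1), (-1, 1), (-1, -1), (1, -1)]

def quadrant(dx, dy):
    """Sign pair (x component, y component) of one drag step."""
    return ((1 if dx > 0 else -1), (1 if dy > 0 else -1))

def detect_rotation(vectors):
    """Return 'clockwise', 'counterclockwise', or None for a drag path
    given as a list of (dx, dy) direction vectors."""
    quads = []
    for dx, dy in vectors:
        q = quadrant(dx, dy)
        if not quads or quads[-1] != q:   # collapse repeats within a quadrant
            quads.append(q)
    for pattern, name in ((CLOCKWISE, "clockwise"),
                          (COUNTERCLOCKWISE, "counterclockwise")):
        # the ring need not start from a fixed quadrant, so test all
        # cyclic rotations of the stored pattern
        for s in range(4):
            if quads[:4] == pattern[s:] + pattern[:s]:
                return name
    return None
```

A real implementation would additionally check that the drag lasted the predetermined period and ignore tiny jitter near the axes; the sketch omits both.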
  • the edit screen generated by the edit screen generation unit 117 will be described.
  • the edit screen is a screen for editing operation sequence information.
  • the action associated with the operation of order i is also referred to as the i-th action.
  • the operation sequence information is referred to when the continuous operation determination unit 118, which will be described in detail later, determines that the operation is continuous.
  • when an operation of another type follows, another type of operation sequence information is referred to following the certain type of operation sequence information. For this reason, the user can edit the actions in consideration of combinations of the types of operation sequence information, so that the actions sent out one after another by successive operations can be more varied.
  • the edit screen generation unit 117 generates an edit screen for each retained character held by the user.
  • the possessed character is a character that can be an operation character when selected by the user.
  • the possessed characters include characters given to the user in advance at the start of the game.
  • the possessed characters also include characters additionally given as the game progresses.
  • the possessed character may be given when the operation character wins a battle with the enemy character.
  • the possessed character to be additionally given may be determined by lottery from one or more characters.
  • the possessed character that is additionally given may appear as an inclusion by opening an unopened thing that is given to the user as the game progresses. In this case, the unopened object may be opened by satisfying a predetermined condition.
  • an unopened thing is expressed as an egg provided when an operation character wins a battle.
  • such an egg may be set in a virtual hatching device and hatched when a predetermined condition is satisfied, so that a character appears.
  • the period until hatching may be shortened in exchange for a consumption item such as virtual currency.
  • similarly, an unopened thing may be opened in exchange for a consumption item instead of satisfying the predetermined condition.
  • the editing screen includes an operation sequence component 115a for each type of operation and a list of action icons 115b.
  • operation sequence components 115a of five types of operations (tap operation, upper flick operation, lower flick operation, left and right flick operation, and rotation operation) are included.
  • the number of operation sequence components 115a included in the edit screen is not limited to the illustrated number. If the number of displayable action icons 115b exceeds the number that can be displayed simultaneously (six in FIGS. 10A and 10B), the scroll bar 115c is displayed. The hidden action icons 115b among the displayable action icons 115b are displayed by operating the scroll bar 115c.
  • the operation sequence component 115a is a component in which slots 115ai (i is a natural number from 1 to N), each representing the action associated with the operation of order i among consecutive operations of the corresponding type, are arranged sequentially from the first stage to the Nth stage.
  • N is an integer equal to or greater than 1, and represents the maximum number of operations that can be included in the type of operation sequence.
  • the maximum number N is an upper limit value of the continuous number that allows the action to be continuously sent out by the continuous operation of the type.
  • the maximum number N is also referred to as the maximum continuous number N.
  • the maximum continuation number N may be different depending on the characteristics of the corresponding possessed character, for example.
  • the maximum continuous number N may be different depending on the characteristics of the type of the corresponding operation. Further, the maximum number N of consecutive times can be increased when a predetermined condition is satisfied.
  • An example of the condition for increasing the maximum number N is exchange for a consumption item such as virtual currency.
  • another example of such a condition is that the possessed character is leveled up.
  • the conditions for increasing the maximum continuous number N are not limited to these.
  • actions can be associated with the slots 115ai up to the editable number M (that is, these slots are editable).
  • the slots 115ai exceeding the editable number M are not editable and cannot be associated with actions.
  • the non-editable slot 115ai is displayed to indicate that it cannot be edited. In the examples of FIGS. 10A and 10B, the slot 115ai in which the lock mark is displayed indicates that editing is impossible.
  • the number of continuous actions that can be continuously sent out by this type of continuous operation is the editable number M.
  • the editable number M can be increased up to the maximum continuous number N according to the conditions satisfied by the possessed character.
  • the editable number M may be different depending on the possessed character.
  • as a condition for increasing the editable number M, exchange for a consumption item such as virtual currency can be cited, for example.
  • as another condition for increasing the editable number M, the possessed character may be raised in level, for example.
  • the conditions for increasing the editable number M are not limited to these.
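The slot model described above, with a maximum continuous number N, an editable number M, and the possibility of raising M toward N, can be pictured with a small sketch; the class and method names are illustrative assumptions, not from the source:

```python
# Toy model of one operation sequence component: N slots in total (the
# maximum continuous number), of which only the first M (the editable
# number) accept actions; the rest are locked.

class OperationSequence:
    def __init__(self, max_slots, editable):
        assert 0 <= editable <= max_slots
        self.slots = [None] * max_slots   # None = empty slot
        self.editable = editable          # editable number M

    def assign(self, order, action):
        """Fit an action into the slot of the given order (1-based).
        Locked slots beyond the editable number M reject the edit."""
        if not 1 <= order <= self.editable:
            return False                  # slot is locked / out of range
        self.slots[order - 1] = action
        return True

    def unlock_one(self):
        """Raise M by one, up to the maximum continuous number N
        (e.g. in exchange for a consumption item or a level-up)."""
        self.editable = min(self.editable + 1, len(self.slots))
```

For the example of FIG. 10A this would be `OperationSequence(5, 4)`: five slots, the fifth locked until some unlock condition is met.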
  • the action icon 115b represents an action that can be associated with each operation in the operation sequence information.
  • the types of actions that can be associated include those that act on enemy characters, those that move operation characters, and those that summon other possessed characters.
  • for the action represented by the action icon 115b, various information such as an action name, action characteristics, and action parameters is defined, for example. Information regarding each of these actions may be displayed in the vicinity of the action icon 115b.
  • for example, the characteristic of the action named “A1” is “xx”, and the cost is “yy”.
  • An example of an action parameter is cost. Details of the cost will be described later.
  • the information defined in the action represented by the action icon 115b is not limited to the information described above. Further, information displayed in the vicinity of the action icon 115b is not limited to such information.
  • the action icon 115b is displayed as an icon including the name of the corresponding action, but the display form of the action icon 115b is not limited to this.
  • the action icon 115b may be displayed as a design icon representing the corresponding action.
  • the list of action icons 115b represents a list of actions that can be associated with the operation corresponding to the slot 115ai. Actions that can be associated include actions already acquired by the corresponding possessed character. Furthermore, the actions that can be associated with each other may include actions that have already been acquired by another possessed character of the same type as the corresponding retained character.
  • the possessed character may acquire a new action by satisfying the condition.
  • the condition for acquiring a new action may be, for example, that the play period of the corresponding possessed character as the operation character exceeds a threshold value. This represents that the operation character hits upon a new technique as the game progresses.
  • the condition for acquiring the action may be, for example, that the operation character acquires a predetermined item. This expresses that the operating character learns a new technique using a predetermined item.
  • the list of action icons 115b represents a list of actions having characteristics corresponding to the type of operation corresponding to the slot 115ai to be edited.
  • the characteristic corresponding to the tap operation may be high homing performance.
  • the characteristic corresponding to the upper flick operation or the lower flick operation may be a high attack power.
  • the characteristic corresponding to the left / right flick may be that the attack range is wide.
  • for example, compared with the up/down or left/right flick operations, the homing performance of actions that can be associated with the tap operation may be made higher, and the homing performance of actions that can be associated with the rotation operation may be made lower.
  • furthermore, the homing performance of actions that can be associated with the left/right flick operation may be made higher than that of actions associated with the up/down flick operations.
  • in FIG. 10A, when any slot 115ai included in the operation sequence component 115a of the tap operation is tapped, a list of action icons 115b representing actions A1 to A6 having characteristics according to the tap operation is displayed.
  • in FIG. 10B, when any slot 115ai included in the operation sequence component 115a of the upper flick operation is tapped, a list of action icons 115b representing actions B1 to B6 having characteristics according to the upper flick operation is displayed.
  • An action icon 115b can be fitted into the editable slot 115ai.
  • the action represented by the inserted action icon 115b is associated with the operation in the order of the slots 115ai.
  • the operation for fitting the action icon 115b into the slot 115ai may be, for example, a drag operation.
  • the operation for fitting may be tapping the slot 115ai and the action icon 115b in this order or in reverse order.
  • the operation for fitting is not limited to these operations.
  • the action icon 115b inserted in the slot 115ai can be cleared by a predetermined operation to become an empty slot.
  • the operation for clearing may be, for example, that the action icon 115b already fitted is dragged outside the frame of the slot 115ai.
  • the clearing operation is not limited to such an operation.
  • an “all clear” function may be provided in which each slot 115ai in each type of operation sequence component 115a is an empty slot.
  • the editing screen includes an operation button (not shown) that instructs all clear.
  • the slot 115ai is displayed to indicate that it is an empty slot if the actions in the corresponding order are not associated.
  • the slot 115ai filled with diagonal lines is an empty slot.
  • in this example, five slots 115ai are arranged. That is, the maximum continuous number N is 5. Further, slots 115a1 to 115a4 are editable, of which slots 115a2 to 115a4 are empty slots. The slot 115a5 is not editable. That is, the editable number M is 4. In other words, in this example, for this possessed character, a continuous action of up to the editable number 4, out of the maximum continuous number 5 of tap operations, can be executed.
  • the total cost of the actions associated with the operations in each piece of operation sequence information is required not to exceed the cost upper limit. For example, whether or not the total cost of the action icons 115b fitted in the slots 115ai of each type of operation sequence component 115a exceeds the upper limit value is determined when editing of the operation sequence information is finished. For example, when an operation instructing the end of editing is performed, if the total cost does not exceed the upper limit value, the edit screen generation unit 117 stores operation sequence information reflecting the edited content in the storage unit 120 as the user information 133. If the total cost exceeds the upper limit value, the edit screen generation unit 117 does not save the operation sequence information. In this case, the edit screen generation unit 117 may continue displaying the edit screen after displaying that the total cost exceeds the upper limit.
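The save-time cost check can be sketched as follows; the data layout and function names are assumptions for illustration, not the actual implementation:

```python
# Sketch of the save-time cost check: the total cost of all actions fitted
# across every type of operation sequence must not exceed an upper limit,
# otherwise the edit is not saved.

def total_cost(sequences, costs):
    """sequences: {operation type: list of action names, None = empty slot}
    costs: {action name: cost}"""
    return sum(costs[a]
               for slots in sequences.values()
               for a in slots
               if a is not None)

def may_save(sequences, costs, cost_limit):
    """True if the edit may be saved (total cost within the upper limit)."""
    return total_cost(sequences, costs) <= cost_limit
```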
  • the continuous operation determination unit 118 determines whether a certain operation is continuous with the previous operation. “Continuing from the previous operation” means that the current operation was performed within a certain period of time after a series of motions of the operation character based on the previous operation started.
  • as the certain period, for example, a period during which the operation character is performing the preparation motion for an attack based on the previous operation may be used.
  • alternatively, the period may last until the return motion is completed, or the operation may be accepted within a predetermined time after the return motion is completed. In other words, a certain operation may be performed before the series of motions of the operation character based on the previous operation is completed.
  • the continuous operation determination unit 118 may determine whether or not the current operation has been performed within a certain period after a series of motions of the operation character based on the previous operation has started. For example, when an operation is performed while an animation representing a preparatory motion before the attack of the operation character based on the previous operation is being performed, the continuous operation determination unit 118 determines that the operation is the previous operation. It may be determined that it is continuous. In addition, when a certain operation is performed between the end of the return motion of the operation character based on the previous operation and the elapse of a certain period, the continuous operation determination unit 118 determines that the certain operation is the previous operation. It may be determined that it is continuous.
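As a toy model of this continuity test, an operation could be treated as continuous when it arrives within a fixed window measured from the start of the previous action's motion; the window value and the names below are assumptions, not from the source:

```python
# Sketch of the continuity determination: the current operation "continues
# from the previous operation" if it arrives within a fixed acceptance
# window after the previous action's motion sequence started.

ACCEPT_WINDOW = 1.5  # seconds; illustrative value only

def is_continuous(motion_start_time, now, window=ACCEPT_WINDOW):
    """True if an operation at time `now` falls within the acceptance
    window measured from when the previous action's motion started;
    False when no previous motion exists (motion_start_time is None)."""
    return motion_start_time is not None and (now - motion_start_time) <= window
```

In the embodiment the window would instead be tied to motion phases (e.g. during the preparation motion, or until a fixed time after the return motion ends) rather than a single constant.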
  • the continuous operation determination unit 118 determines an action to be executed with reference to operation sequence information according to the type of the previous operation and the current operation.
  • the continuous operation determination unit 118 notifies the game execution unit 115 of the determined action.
  • the continuous operation determination unit 118 counts the ordinal number of the current type of operation (that is, the order i). The order i can be calculated by storing the type of operation in a history each time an operation is determined to be continuous. Note that the continuous operation determination unit 118 may clear the history of operation types when an operation is determined not to be continuous. Then, the continuous operation determination unit 118 refers to the operation sequence information for the type of the current operation, and determines the action associated with the operation of the current order i as the action to be executed next. Next, the case where the current order i exceeds the editable number M in the operation sequence information will be described.
  • in this embodiment, it is assumed that the continuous operation determination unit 118 does not determine an action to be executed next, on the basis that no further continuous operation is possible. That is, once the continuous count exceeds the editable number M, the same type of operation becomes invalid until it is determined that an operation is not continuous with the previous operation. Thereby, this embodiment can urge the user to perform operations that connect a plurality of types of operation sequences as much as possible.
  • the process when the continuous number exceeds the editable number M is not limited to the process for invalidating the operation as described above.
  • the continuous operation determination unit 118 may reset the order to 1 and determine the first-stage action in the operation sequence information as an action to be executed next. In this case, if the user performs the same type of operation continuously, it is possible to send out actions successively one after another by the number of continuous numbers without any upper limit.
  • when the type of the current operation differs from that of the previous operation, the continuous operation determination unit 118 refers to the operation sequence information for the type of the current operation, and determines the first-stage action in that operation sequence information as the action to be executed next.
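Putting the determination logic together (order counting within a run of same-type operations, invalidation past the editable number M in this embodiment, and the wrap-around variation), a sketch might look like this; all names are illustrative assumptions:

```python
# Sketch of the continuous-operation determination. While operations stay
# continuous, same-type operations advance through the sequence; past the
# editable number M the operation is either invalidated (this embodiment)
# or wrapped back to order 1 (the described variation).

class ContinuousOperationDeterminer:
    def __init__(self, sequences, wrap=False):
        self.sequences = sequences    # {op type: list of up to M actions}
        self.prev_type = None
        self.order = 0                # order i within the current run
        self.wrap = wrap              # variation: restart at 1 past M

    def on_operation(self, op_type, continuous):
        """Return the action to execute, or None when the operation is
        invalidated (order beyond the editable number M, no wrap)."""
        if not continuous or op_type != self.prev_type:
            self.order = 1            # first-stage action of this type
        else:
            self.order += 1           # next stage of the same type
        self.prev_type = op_type
        actions = self.sequences[op_type]
        if self.order > len(actions):
            if self.wrap:
                self.order = 1        # variation: wrap around to stage 1
            else:
                return None           # this embodiment: operation invalid
        return actions[self.order - 1]
```

With the invalid behavior, a same-type operation past M stays invalid until a non-continuous operation resets the run, matching the description above.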
  • when notified of the action determined by the continuous operation determination unit 118, the game execution unit 115 instructs the other units to cause the operation character to execute the action based on the current operation following the action based on the previous operation.
  • assume that the action based on the previous operation and the action based on the current operation are actions that act on the enemy character.
  • the action that acts on the enemy character includes a preparation motion, an attack motion, and a return motion, as described above.
  • assume also that the current operation is performed during the display period of the technique motion based on the previous operation.
  • the following two patterns can be considered as timings at which the operation character starts the action based on the current operation.
  • the game execution unit 115 causes the operation character to end the technique motion by the previous operation, cancels the subsequent return motion, and starts the preparation motion by the current operation.
  • the game execution unit 115 causes the operation character to cancel the technique motion by the previous operation and immediately starts the preparation motion by the current operation.
  • the game execution unit 115 can cause the operation character to continuously execute the action determined based on the operation sequence information according to the continuous operation.
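The two timing patterns above can be sketched with a toy motion queue; the `Animator` class and the motion names are assumptions for illustration, not part of the source:

```python
# Toy model of chaining actions. An action plays
# "prepare" -> "attack" -> "return". When a continuous operation arrives
# during the technique (attack) motion, pattern 1 finishes that motion and
# cancels only the return motion, while pattern 2 cancels the technique
# motion immediately as well.

class Animator:
    def __init__(self):
        self.queue = []          # motions still to be displayed, in order

    def play(self, motions):
        self.queue.extend(motions)

    def chain(self, next_action, pattern):
        """Start `next_action`, assuming the previous action's technique
        motion is currently being displayed."""
        if pattern == 1:
            # pattern 1: keep the ongoing technique motion, drop the
            # previous action's return motion
            self.queue = [m for m in self.queue if not m.endswith(":return")]
        else:
            # pattern 2: cancel everything remaining of the previous action
            self.queue = []
        self.play([f"{next_action}:prepare",
                   f"{next_action}:attack",
                   f"{next_action}:return"])
```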
  • FIG. 11 is a flowchart showing the flow of the operation sequence information editing process. It is assumed that the game screen has already been displayed on the display unit 152 at the start of the following process, and that the displayed game screen includes a menu or an operation button for instructing the start of editing of operation sequence information. Further, in the following description, having the display control unit 112-1 display a processing result of the edit screen generation unit 117 is simply referred to as the edit screen generation unit 117 displaying it.
  • in step S101-1, the operation reception unit 111-1 determines whether an operation instructing the start of editing of operation sequence information has been received for a certain possessed character.
  • the target possessed character may be the operation character selected at this time, or may be another possessed character.
  • in step S102-1, the edit screen generation unit 117 reads operation sequence information related to the possessed character from the storage unit 120.
  • the operation sequence information read at this time is either operation sequence information that has already been edited and saved, or operation sequence information in an initial state. In the operation sequence information in the initial state, it is assumed that a predetermined action is associated with each operation.
  • in step S103-1, the edit screen generation unit 117 displays an edit screen based on the operation sequence information of the possessed character.
  • in step S104-1, the operation reception unit 111-1 receives an operation on one of the slots 115ai.
  • as the operation on one of the slots 115ai, a tap operation or a drag operation on the slot 115ai may be accepted, for example.
  • the slot 115ai in which the operation has been accepted is also referred to as an operation target slot 115ai.
  • in step S105-1, the edit screen generation unit 117 branches the process depending on the state of the slot 115ai for which the operation was accepted.
  • if the operation target slot 115ai is not editable, in step S106-1 the edit screen generation unit 117 displays a notification that the corresponding slot 115ai cannot be edited.
  • next, the case where an action icon 115b has already been fitted in the operation target slot 115ai in step S105-1 will be described.
  • suppose the operation accepted in step S104-1 is an operation instructing clearing of the slot 115ai (Yes in step S107-1).
  • in that case, in step S108-1, the edit screen generation unit 117 clears the action icon 115b from the corresponding slot 115ai and displays it as an empty slot.
  • if the operation target slot 115ai is an empty slot, in step S109-1 the edit screen generation unit 117 displays a list of action icons 115b corresponding to the slot 115ai.
  • the action icon 115b included in the list represents an action having a characteristic corresponding to the type of operation corresponding to the slot 115ai among the actions already acquired by the corresponding owned character and other owned characters of the same type.
  • in step S110-1, the operation reception unit 111-1 receives an operation of dragging any one of the listed action icons 115b to a slot 115ai. It is assumed that the slots 115ai effective as a drag destination are the slots 115ai included in the same operation sequence component 115a as the operation target slot 115ai operated in step S104-1.
  • in step S111-1, the edit screen generation unit 117 displays an image in which the dragged action icon 115b is fitted in the drag-destination slot 115ai.
  • if it is determined in step S112-1 that an operation instructing the end of editing has not been accepted, the edit screen generation unit 117 repeats the process from step S104-1.
  • if such an operation has been accepted, in step S113-1 the edit screen generation unit 117 determines whether or not the total cost exceeds the upper limit value. Specifically, the edit screen generation unit 117 may calculate the total cost of the action icons 115b fitted in the slots 115ai included in each type of operation sequence component 115a, and determine whether or not the calculated value exceeds the upper limit value.
  • if, in step S113-1, the total cost exceeds the upper limit value, the edit screen generation unit 117 displays a notification to that effect. Then, the edit screen generation unit 117 repeats the process from step S104-1.
  • otherwise, in step S114-1, the edit screen generation unit 117 stores operation sequence information reflecting the edited contents of each operation sequence component 115a in the storage unit 120-1.
  • in step S115-1, the display control unit 112-1 closes the edit screen and displays the game screen.
  • 12A, 12B, and 12C are diagrams illustrating examples of the edit screen.
  • the edit screen shown in FIG. 12A is displayed when an instruction to start editing operation sequence information is given for a certain possessed character (step S103-1).
  • This edit screen includes an operation sequence component 115a for each type of operation. For example, in the operation column component 115a for the tap operation, action icons 115b representing actions A3, A6, A1, and A5 are fitted in the slots 115a1 to 115a4. In addition, a key mark is displayed in the slot 115a5 to indicate that editing is impossible.
  • the edit screen shown in FIG. 12B is displayed when, on the edit screen of FIG. 12A, the slot 115a2 included in the operation sequence component 115a of the tap operation is dragged out of its frame (step S104-1; “there is an action” in step S105-1; Yes in step S107-1; step S108-1). That is, as a result of this operation, the slot 115a2 for the tap operation becomes an empty slot. In addition, since the slot 115a2 that has become an empty slot is selected, a list of action icons 115b corresponding to the tap operation is additionally displayed (“empty slot” in step S105-1; step S109-1).
  • the edit screen shown in FIG. 12C is displayed when the action icon 115b representing the action A4 is dragged to the slot 115a2 for the tap operation on the edit screen shown in FIG. That is, by this drag operation, the action icon 115b representing the action A4 is inserted into the tap operation slot 115a2 (steps S110-1 and S111-1).
  • when the end of editing is instructed in this state, each type of operation sequence information is saved as follows. That is, actions A3, A4, A1, and A5 are associated with the operations in the first to fourth stages in the operation sequence information of the tap operation.
  • the action C3 is associated with the first-stage operation.
  • No action is associated with the operation sequence information of the lower flick.
  • the action E3 is associated with the first-stage operation.
  • FIG. 13 is a flowchart showing the flow of processing according to successive operations.
  • in step S201-1, when the operation reception unit 111-1 accepts an operation for starting a game, the game execution unit 115 starts the game.
  • in step S202-1, the game execution unit 115 generates a game screen representing a game space in which the operation character exists, based on various game information acquired from the server 200. Then, the display control unit 112-1 displays the generated game screen on the touch screen 15.
  • in step S203-1, the operation reception unit 111-1 reads operation sequence information about the operation character from the storage unit 120-1.
  • in step S204-1, when the operation reception unit 111-1 receives an operation on the touch screen 15, in step S205-1 the continuous operation determination unit 118 determines whether or not the received current operation is continuous with the previous operation.
  • for example, the continuous operation determination unit 118 determines that the current operation is continuous when it is received before the end of the motion representing the action of the operation character based on the previous operation. In other cases, the continuous operation determination unit 118 determines that the operations are not continuous.
  • the criteria for determining whether or not they are continuous are not limited to this, and other criteria can also be adopted.
  • If it is determined in step S205-1 that the current operation is not continuous with the previous operation, the continuous operation determination unit 118 refers, in step S206-1, to the operation sequence information for the type of the current operation among the operation sequence information of the operation character, and determines the action associated with the first-stage operation in that operation sequence information as the action to be executed.
  • If it is determined in step S205-1 that the current operation is continuous with the previous operation, step S207-1 is executed.
  • In step S207-1, the continuous operation determination unit 118 determines whether the type of the current operation and the type of the previous operation are the same.
  • If it is determined in step S207-1 that they are not of the same type, step S206-1 is executed. As a result, the action associated with the first-stage operation in the operation sequence information for the type of the current operation is determined as the action to be executed.
  • If it is determined in step S207-1 that they are of the same type, step S208-1 is executed.
  • In step S208-1, the continuous operation determination unit 118 specifies the stage (that is, the order) of the current operation within this type of operation.
  • In step S209-1, the continuous operation determination unit 118 determines whether the order specified in step S208-1 is equal to or less than the editable number M for this type of operation.
  • If the order of the operation exceeds the editable number M in step S209-1, the processing from step S204-1 is repeated without determining an action to be executed. That is, same-type operations beyond the editable number M are invalid until it is determined that an operation is not continuous with the previous one.
  • In step S210-1, the continuous operation determination unit 118 determines, in the operation sequence information of this type, the action associated with the operation of the order specified in step S208-1 as the action to be executed.
  • step S211-1 the game executing unit 115 notifies each unit to cause the operating character to execute the determined action.
  • the animation generation unit 114-1 generates an animation in which the operation character performs a motion of the determined action.
  • the display control unit 112-1 displays the generated animation.
  • If the operation character is still executing the previous action, the game execution unit 115 controls each unit so as to cancel the display based on the previous action and to perform the display based on the action determined this time. At this time, if the motion currently displayed is the return motion of the previous action, the game execution unit 115 performs control so as to immediately cancel the display of the return motion and display the motion of the current action.
  • Alternatively, if the attack motion of the previous action is being displayed, the game execution unit 115 may wait until the display of the attack motion is finished, cancel the display of the return motion, and then display the motion of the current action. Alternatively, in this case, the game execution unit 115 may perform control so as to cancel the display of the attack motion and display the motion of the current action.
  • The control unit 110-1 then repeats the processing from step S204-1.
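The determination flow of steps S204-1 through S210-1 can be sketched in code as follows. This is a minimal illustration only; the class name, the `sequences` dictionary layout, and the externally supplied continuity judgment are assumptions made for the example, not the embodiment's actual implementation.

```python
class ContinuousOperationDeterminer:
    """Sketch of steps S204-1 to S210-1: map a stream of operations to
    actions using per-type operation sequence information."""

    def __init__(self, sequences, editable_counts):
        # sequences: {operation type: [stage-1 action, stage-2 action, ...]}
        # editable_counts: {operation type: editable number M}
        self.sequences = sequences
        self.editable_counts = editable_counts
        self.prev_type = None
        self.stage = 0  # stage within the current run of same-type operations

    def accept(self, op_type, continuous):
        """Return the action to execute, or None when the operation is
        invalid (its order exceeds the editable number M)."""
        if not continuous or op_type != self.prev_type:
            # S205-1 "no" or S207-1 "no": restart from the first stage (S206-1)
            self.prev_type = op_type
            self.stage = 1
        else:
            # S207-1 "yes": advance the stage within this type (S208-1)
            self.stage += 1
        # S209-1: operations beyond the editable number M are invalid
        if self.stage > self.editable_counts.get(op_type, 0):
            return None
        # S210-1: the action associated with the operation of this order
        return self.sequences[op_type][self.stage - 1]

# Hypothetical operation sequence information for a single character:
determiner = ContinuousOperationDeterminer(
    {"tap": ["A3", "A4", "A1", "A5"], "left/right flick": ["C3"]},
    {"tap": 4, "left/right flick": 1},
)
print(determiner.accept("tap", False))              # A3
print(determiner.accept("tap", True))               # A4
print(determiner.accept("left/right flick", True))  # C3
```

Switching the operation type restarts from the first stage, while a fifth continuous tap against an editable number of 4 would return `None`, mirroring the invalidation described above.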
  • As an example, assume that the operation sequence information edited on the editing screen illustrated in FIG. has been saved.
  • Suppose operations are then received continuously in the order of a tap operation, a tap operation, and a left/right flick operation.
  • In this case, actions are delivered in the order of action A3 associated with the first-stage tap operation, action A4 associated with the second-stage tap operation, and action C3 associated with the first-stage left/right flick operation.
  • Next, suppose the tap operation is received continuously five times or more.
  • In this case, actions are delivered up to the fourth stage in the order of actions A3, A4, A1, and A5. Thereafter, if the fourth-stage action A5 is an action that attacks an enemy character, the fifth and subsequent tap operations are invalid until the return motion of action A5 ends.
  • As described above, in this embodiment, the operation sequence information can be edited for each type of operation, and when continuous operations are received, the operation character is caused to execute actions determined according to the edited operation sequence information. The user can therefore change the game development according to continuous operations through trial and error. The user can thus enjoy various game developments according to continuous operations, which further enhances the interest of the game.
  • Further, in this embodiment, the actions that can be associated with each operation included in an operation sequence are actions already acquired by the possessed character or by another possessed character of the same type. This embodiment can therefore cause the character to execute actions that match the world view of the possessed character or of its type.
  • Further, in this embodiment, on the editing screen for each possessed character, actions that can be associated with each operation included in the operation sequence are added according to the conditions satisfied by the possessed character, and the number of operations that can be edited in the operation sequence also increases according to those conditions. The user can therefore edit the operation sequence information of a possessed character more advantageously by strengthening that character, which motivates the user to play the game.
  • the editing screen generated by the editing screen generation unit 117 has been described as including a function for editing operation sequence information.
  • the edit screen may include a function for automatically setting recommended operation sequence information.
  • the recommended operation sequence information is operation sequence information in which a recommended action is associated with each operation included in the operation sequence.
  • In response to an operation instructing automatic setting of the recommended operation sequence information, the editing screen associates the recommended actions with each operation included in the operation sequence in a batch, based on the recommended operation sequence information.
  • the action icons 115b according to the recommended operation sequence information may be automatically and collectively inserted into the slots 115ai of the operation sequence component 115a.
  • the editing screen may include a function for selecting any of a plurality of recommended operation sequence information and performing automatic setting.
  • the plurality of recommended operation sequence information may be configured to have different characteristics. In this case, the user can select recommended operation sequence information in consideration of characteristics.
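Such a batch of preset associations could be represented as simple data. The sketch below is an assumption made for illustration: the preset names echo “Recommendation 1” and “Recommendation 2”, but their contents and the function name are invented for the example.

```python
# Hypothetical presets; the action assignments are invented for illustration.
RECOMMENDED_PRESETS = {
    "Recommendation 1": {  # emphasizes the power of the final technique
        "tap": ["A3", "A4", "A1", "A5"],
        "left/right flick": ["C3"],
    },
    "Recommendation 2": {  # emphasizes leaving no gap between actions
        "tap": ["A3", "A1", "A3", "A1"],
        "left/right flick": ["C3"],
    },
}

def apply_recommended(slots, preset_name):
    """Batch-fill every slot from the chosen preset, as the automatic
    setting triggered by the "recommend" button would."""
    for op_type, actions in RECOMMENDED_PRESETS[preset_name].items():
        slots[op_type] = list(actions)  # overwrite the slots in one batch
    return slots
```

A single call replaces every slot of every operation type at once, which is the “collective” association described above.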
  • FIG. 14 is an example of an edit screen for automatically setting recommended operation sequence information.
  • the editing screen of FIG. 14A includes a “recommend” button 115d.
  • the “recommend” button 115d is a button for accepting an operation for instructing automatic setting of recommended operation sequence information.
  • FIG. 14B is an example of a recommended operation sequence information selection screen that is displayed when the “recommend” button 115d is operated on the editing screen of FIG. 14A.
  • the recommended operation sequence information of “Recommendation 1” has a configuration that emphasizes the power of the final decision technique.
  • the recommended operation sequence information of “Recommendation 2” has a configuration that emphasizes leaving no gap between actions that are delivered in succession.
  • FIG. 14C is an example of an editing screen in which “Recommendation 1” is selected and automatic setting is performed on the editing screen of FIG. 14B.
  • The corresponding action icons 115b are automatically fitted into the corresponding slots 115ai in accordance with the operation sequence information of each type in “Recommendation 1”.
  • the editing screen may further include a function of presenting a recommended pattern for recommended continuous operation according to automatically set recommended operation sequence information.
  • the recommended pattern of continuous operation is information indicating the type and order of operations recommended as continuous operations, such as “tap, tap, tap, upper flick”, and the like.
  • After automatically setting the recommended operation sequence information, the user can input the recommended pattern of continuous operations; for example, the user can effectively chain the actions delivered in succession up to the final technique. The interest of the game is thereby improved.
  • the edit screen may further include a function for presenting a condition for causing the possessed character to acquire the action if there is an unacquired action that can further enhance the configuration of the recommended operation sequence information.
  • the acquisition condition may be, for example, to participate in a predetermined event in the game space and clear a predetermined mission, but is not limited thereto.
  • the editing screen may have a function of presenting a recommended pattern for continuous operation so as to include the action. For example, it is assumed that there is an action newly acquired by the user and the action has never been associated with the operation sequence information. In this case, a recommended pattern for continuous operation including the action may be presented on the editing screen.
  • The user can easily enjoy the game development according to continuous operations by using the recommended operation sequence information.
  • the user can easily learn a pattern of continuous operations effective when such recommended operation sequence information is set.
  • the motivation of the user for playing the game is strengthened.
  • the operation sequence information has been described as being edited for each user.
  • the edit screen generated by the edit screen generation unit 117 can be modified to include a function for performing automatic setting based on operation sequence information edited by another user.
  • the edit screen may have a function of disclosing operation sequence information of any possessed character edited by the user to other users.
  • the editing screen may have a function of acquiring and automatically setting operation sequence information published by other users on the editing screen of an arbitrary possessed character edited by the user.
  • the function of publishing operation sequence information may include a function of publishing user comments together with operation sequence information.
  • The function for automatically setting the operation sequence information of other users may include a function of displaying, to the user, the comments published together with the operation sequence information of one or more other users, and then allowing the user to select the operation sequence information to be automatically set.
  • FIG. 15 shows an example of the editing screen modified as described above.
  • the editing screen shown in FIG. 15A includes a disclosure button 115e and an acquisition button 115f.
  • When the publish button 115e is operated, the edit screen generation unit 117 transmits the edited operation sequence information to the server 200 as a publication target.
  • FIG. 15B is an example of a selection screen displayed when the acquisition button 115f is operated on the editing screen of FIG. 15A.
  • the edit screen generation unit 117 acquires a list of published operation sequence information from the server 200 in response to an operation on the acquisition button 115f, and displays it in a selectable manner.
  • operation sequence information 1 published by the user 1 and operation sequence information 2 published by the user 2 are displayed in a selectable manner.
  • When one piece of operation sequence information is selected from the list of published operation sequence information, the edit screen generation unit 117 need only automatically set the corresponding action icons 115b into the corresponding slots 115ai based on the selected operation sequence information.
  • the user can enjoy the game development based on the operation sequence information set by other users by sharing the operation sequence information with other users.
  • FIG. 16 is an example of an editing screen in this modification.
  • the edit screen includes one or more operation sequence components 315a.
  • Each operation sequence component 315a includes a slot 315ai.
  • a list of action icons 315b is displayed at the bottom of the editing screen.
  • the action icon 315b differs from the above-described action icon 115b in that the type of operation (for example, tap) that can be associated is displayed in addition to the name of the action. Since the other points are configured in the same manner as the action icon 115b, detailed description will not be repeated.
  • the operation sequence component 315a is different from the above-described operation sequence component 115a in that the action icons 315b that can be fitted in the respective slots 315ai may correspond to different types of operations. Since the other points are configured in the same manner as the operation sequence component 115a, detailed description will not be repeated.
  • The operation sequence information saved by reflecting the edited content of the operation sequence component 315a is information in which an operation type and an action are associated, as a set, with each of the operations from the first to the Nth stages.
  • In this modification, the continuous operation determination unit 118 skips the process of step S207-1 in the processing shown in FIG. 13.
  • In step S210-1, instead of referring to the operation sequence information for the type of the current operation, the continuous operation determination unit 118 need only refer to the operation sequence information corresponding to the order of the operations received so far.
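In this modification the lookup can be sketched as follows: the stage is simply the count of operations received so far in the continuous run, and each stage pairs an operation type with an action. The function name and data layout are assumptions made for illustration.

```python
def determine_action(sequence, history):
    """Sketch of the modified step S210-1: `sequence` associates each
    stage (1st to Nth) with an (operation type, action) pair; `history`
    is the list of operation types received so far in the continuous
    run, the last entry being the current operation."""
    stage = len(history)  # order of the current operation (1-based)
    if stage > len(sequence):
        return None  # beyond the edited stages: invalid
    expected_type, action = sequence[stage - 1]
    # the current operation must match the type edited for this stage
    return action if history[-1] == expected_type else None
```

For instance, with `[("tap", "A3"), ("left/right flick", "C3")]`, a tap followed by a left/right flick delivers A3 and then C3, while a second tap at the second stage delivers nothing because that stage was edited for a different operation type.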
  • The control blocks of the control unit 210 and the control unit 110-1 may be realized by a logic circuit (hardware) formed in an integrated circuit (IC chip) or the like, or may be realized by software using a CPU (Central Processing Unit).
  • In the latter case, the information processing apparatus including the control unit 210 and/or the control unit 110-1 includes a CPU that executes instructions of a program, which is software realizing each function; a ROM (Read Only Memory) or storage device (referred to as a “recording medium”) in which the program and various data are recorded so as to be readable by the computer (or CPU); a RAM (Random Access Memory) into which the program is expanded; and the like. The object of the present invention is achieved by the computer (or CPU) reading the program from the recording medium and executing it.
  • As the recording medium, a “non-transitory tangible medium” such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used.
  • the program may be supplied to the computer via an arbitrary transmission medium (such as a communication network or a broadcast wave) that can transmit the program.
  • one embodiment of the present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.
  • the game program executed in the computer (100) including the processor (10), the memory (11), and the touch screen (15) has been described.
  • The game program causes the processor (10) to execute: a step of accepting an input operation on the touch screen (15) (step S103); a step of controlling the placement of a virtual camera in the game space (step S102); a step of displaying a captured image of the virtual camera on the touch screen (15) (step S102); and a step (step S108) of, when a first input operation input by moving the touch position on the touch screen (15) from a first position (L1) to a second position (L2) is accepted, comparing the direction of the touch operation determined by the first position (L1) and the second position (L2) with the direction in which the game character (C) faces on the screen displayed on the touch screen (15), and causing the game character (C) to execute an action according to the comparison result.
  • Since the actions can be differentiated according to the direction in which the game character faces and the direction of the touch operation, the interest of the game is improved.
  • The executing step includes causing the game character to execute an action for acting on another object when the direction of the touch operation is included in a certain range of directions determined by the direction in which the game character (C) faces at the time the first input operation is accepted. The operations for causing the game character to execute an attacking action thus become more diverse.
  • The executing step includes: when the first input operation is accepted and a second input operation input without changing the touch position on the touch screen (15) has been accepted within a certain period before the first input operation, causing the game character (C) to execute an action for acting on another object; and, when the first input operation is accepted and the second input operation has not been accepted within the certain period before the first input operation, causing the game character (C) to execute an action for moving in the direction of the touch operation without executing the action for acting on another object.
  • The executing step includes, when the second input operation is accepted, causing the game character (C) to execute another action according to the second input operation.
  • When the first input operation is accepted, the second input operation has been accepted within a certain period before the first input operation, and the direction of the touch operation is not included in the certain range of directions, the executing step causes the game character (C) to execute an action for moving in the direction of the touch operation without executing an action for acting on another object, and subsequently causes the game character (C) to execute an action for acting on the other object that is different from the other actions. Since the game character can attack and evade through various operations, the interest of the game is improved.
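The comparison between the direction of the touch operation (determined by the first and second positions) and the direction the game character faces can be sketched as below. The 45-degree half-range and the function names are assumptions made for illustration; the description does not specify concrete values.

```python
import math

def choose_action(first_pos, second_pos, facing_deg, half_range_deg=45.0):
    """Compare the flick direction (first position -> second position)
    with the direction the character faces; if the flick falls inside
    the range, act on another object (attack), otherwise move."""
    dx = second_pos[0] - first_pos[0]
    dy = second_pos[1] - first_pos[1]
    flick_deg = math.degrees(math.atan2(dy, dx))
    # smallest signed difference between the two directions
    diff = (flick_deg - facing_deg + 180.0) % 360.0 - 180.0
    return "attack" if abs(diff) <= half_range_deg else "move"

# A flick toward the facing direction attacks; a sideways flick moves.
print(choose_action((0, 0), (10, 0), facing_deg=0.0))   # attack
print(choose_action((0, 0), (0, 10), facing_deg=0.0))   # move
```

The modular arithmetic keeps the angular difference in the range of -180 to 180 degrees, so a flick opposite to the facing direction is correctly classified as outside the range.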
  • The game program further causes the processor (10) to execute: a step of accepting a third input operation input by moving the touch position on the touch screen (15) from a third position to a fourth position; and a step of changing the direction in which the game character (C) faces based on the direction of the touch operation corresponding to the third input operation.
  • The first input operation is a flick operation, and the third input operation is a swipe operation. The game character can thereby execute a greater variety of actions.
  • The executing step includes: when a flick operation is accepted with the direction of the touch operation included in the certain range of directions determined by the direction in which the game character (C) faces, causing the game character (C) to execute an action for acting on another object; and, when a second input operation input without changing the touch position on the touch screen (15) is accepted, causing the game character (C) to execute another action for acting on another object that is different from the action executed when the flick operation is accepted. Since the game character can execute attacking actions through various operations, the interest of the game is improved.
  • The executing step includes, when a flick operation is accepted with the direction of the touch operation not included in the certain range of directions determined by the direction in which the game character (C) faces, causing the game character (C) to execute an action for moving without executing an action for acting on another object. Since a flick operation is one that the user inputs intuitively and quickly in a game that requires quick operations, the user can intuitively choose between attack and evasion. The interest of the game is therefore improved.
  • The executing step includes, when the first input operation is accepted during at least part of the execution period of an action for acting on another object that the game character started executing in response to the first input operation or the second input operation: causing the game character to execute an action for moving in the direction of the touch operation without executing an action for acting on another object when the direction of the touch operation is included in the certain range of directions; and, when the direction of the touch operation is not included in the certain range of directions, causing the game character to execute the action for moving in the direction of the touch operation without executing an action for acting on another object, and then causing the game character to execute an action for acting on the other object.
  • (Item 14) A computer-readable recording medium on which the game program described in any one of (Item 1) to (Item 13) is recorded.
  • the computer includes a processor, a memory, and a touch screen, and executes a game program.
  • The method includes: a step in which the computer (100) accepts an input operation on the touch screen (15); a step of controlling the placement of a virtual camera in the game space; a step of displaying a captured image of the virtual camera on the touch screen (15); and a step of, when a first input operation input by moving the touch position on the touch screen (15) from a first position (L1) to a second position (L2) is accepted, comparing the direction of the touch operation determined by the first position and the second position with the direction in which the game character faces on the screen displayed on the touch screen (15), and causing the game character (C) to execute an action according to the comparison result.
  • The information processing apparatus (100) includes a storage unit (120) configured to store a game program, a control unit (110) configured to control the operation of the information processing apparatus (100), and a touch screen (15).
  • The control unit accepts an input operation on the touch screen (15), controls the placement of a virtual camera in the game space, displays a captured image of the virtual camera on the touch screen (15), and, when a first input operation input by moving the touch position on the touch screen (15) from a first position to a second position is accepted, compares the direction of the touch operation determined by the first position and the second position with the direction in which the game character (C) faces on the screen displayed on the touch screen (15), and causes the game character (C) to execute an action according to the comparison result.
  • a game based on a game program is a game that causes an operation character operated by a user to execute an action associated with the operation when an operation on the touch screen is accepted.
  • the game program causes the processor to execute a step of displaying an editing screen that associates each operation included in the operation sequence including continuous operations on the touch screen with an action according to the order in the operation sequence of the operation.
  • According to this configuration, the user can associate, with each operation included in the operation sequence, an action according to the order of that operation in the sequence. When the user then performs operations continuously, the operation character can be caused to continuously execute the actions associated with the respective operations in accordance with their order. This improves the interest of the game.
  • the type of operation may include at least one of a tap operation, a flick operation, and a rotation operation.
  • the rotation operation is an operation in which the locus of the touch position with respect to the touch screen becomes a ring shape or a substantially ring shape.
  • the type of operation may include a type based on the direction of the operation.
  • the direction of operation is the direction of movement of the touch position with respect to the touch screen.
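A rough way to classify a touch locus into the operation types named above (tap, flick typed by its direction, rotation with a ring-shaped locus) is sketched below; the pixel thresholds and function name are assumptions made for illustration.

```python
import math

def classify_operation(points, tap_threshold=10.0, ring_tolerance=20.0):
    """Classify a touch locus (list of (x, y) points) as a tap, a
    rotation (ring or substantially ring-shaped locus), or a flick
    typed by its movement direction."""
    start, end = points[0], points[-1]
    net = math.hypot(end[0] - start[0], end[1] - start[1])
    path = sum(
        math.hypot(b[0] - a[0], b[1] - a[1])
        for a, b in zip(points, points[1:])
    )
    if path <= tap_threshold:
        return ("tap", None)
    if net <= ring_tolerance and path > 4 * ring_tolerance:
        # the locus travels far but returns near its start: a ring shape
        return ("rotation", None)
    angle = math.degrees(math.atan2(end[1] - start[1], end[0] - start[0]))
    direction = "left/right" if abs(angle) <= 45 or abs(angle) >= 135 else "up/down"
    return ("flick", direction)
```

Comparing the total path length with the net start-to-end displacement is one simple way to tell a ring-shaped locus (long path, small displacement) from a flick (both large) and a tap (both small).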
  • On the editing screen, the actions that can be associated with each operation included in the operation sequence may have characteristics according to the type of the associated operation. When the user performs continuous operations, the action associated with each operation can then easily be recalled from its type, which improves operability.
  • The step of displaying the editing screen may display, for each possessed character held by the user as a character that can become the operation character, an editing screen that associates, with each operation included in the operation sequence, an action according to the order of that operation in the sequence. The user can thereby edit operation sequence information for each possessed character.
  • the action that can be associated with each operation included in the operation sequence on the editing screen related to the retained character may include an action that the retained character has already acquired in the game.
  • the user can cause the operating character to execute an action that matches the character's view of the world.
  • The actions that can be associated with each operation included in the operation sequence on the editing screen for a possessed character may include actions already acquired in the game by other possessed characters of the same type as that character. The user can thus cause the operation character to execute various actions that match the world view of the character type.
  • the number of operations that can be included in the operation sequence may be equal to or less than the maximum number.
  • the user can cause the operation character to continuously execute up to the maximum number of actions using one operation sequence.
  • When (Item 10) is applied in (Item 2), the user can cause the operation character to execute more than the maximum number of actions by chaining operation sequences of a plurality of types, each containing up to the maximum number of operations. The interest of the game therefore increases.
  • actions may be associated with operations up to the editable number set to the maximum number or less on the edit screen.
  • the user can cause the operation character to continuously execute actions up to the editable number with one operation sequence.
  • Since the user can expect that the editable number can be increased up to the maximum number, the user's motivation for playing the game is enhanced.
  • A cost may be set for each action, and on the editing screen it may be possible to associate actions with operations within a range in which the sum of the costs set for the actions associated with the operations included in the one or more editable operation sequences does not exceed a predetermined cost upper limit value. This prevents the interest of the game from deteriorating because the actions continuously executed by the operation character in response to continuous operations become too advantageous for the game development.
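The cost constraint could be checked as in the sketch below; the cost table and function name are hypothetical, invented for the example.

```python
# Hypothetical per-action costs for illustration.
ACTION_COSTS = {"A1": 2, "A3": 1, "A4": 2, "A5": 4, "C3": 3}

def can_associate(current_actions, new_action, cost_limit):
    """Allow a new association only while the total cost of all actions
    set in the editable operation sequences stays within the limit."""
    total = sum(ACTION_COSTS[a] for a in current_actions)
    return total + ACTION_COSTS[new_action] <= cost_limit
```

The editing screen would call such a check each time the user tries to fit an action icon into a slot, rejecting the association when the total would exceed the cost upper limit value.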
  • The cost upper limit value may be increased when a predetermined condition is satisfied in the game. The user is thereby motivated to play the game in order to increase the cost upper limit value.
  • The step of displaying the editing screen may collectively associate actions with the operations included in the operation sequence based on information in which another user has associated actions with the operations included in an operation sequence. The user can thus enjoy game development according to continuous operations using the actions associated with the operations by another user.
  • The step of displaying the editing screen may collectively associate recommended actions with the operations included in the operation sequence. The user can thus enjoy game development according to continuous operations using the action recommended for each operation included in the operation sequence.
  • the game program (131) is executed by a computer including a processor (10), a memory (11), and a touch screen (15).
  • When an operation on the touch screen is accepted, the operation character operated by the user is caused to execute an action associated with the operation.
  • the processor displays an editing screen in which each operation included in an operation sequence including continuous operations on the touch screen is associated with an action according to the order in the operation sequence of the operation.
  • the method according to (Item 17) has the same effects as the game program according to (Item 1).
  • the information processing device includes a storage unit (120) that stores the game program (131), and a control unit (110) that controls the operation of the information processing device by executing the game program. And a touch screen (15).
  • a game based on a game program is a game that causes an operation character operated by a user to execute an action associated with the operation when an operation on the touch screen is accepted.
  • the control unit displays an editing screen that associates each operation included in the operation sequence including continuous operations on the touch screen with an action according to the order of the operation sequence of the operation.
  • the information processing apparatus according to (Item 18) has the same effects as the game program according to (Item 1).
  • 1 game system, 2 network, 10, 20 processor, 11, 21 memory, 12, 22 storage, 13, 23 communication IF, 14, 24 input/output IF, 15 touch screen, 17 camera, 18 distance sensor, 100 user terminal (information processing apparatus, client computer), 110, 210 control unit, 111-1 operation reception unit, 112-1 display control unit, 113-1 UI control unit, 114-1 animation generation unit, 115 game execution unit, 116 camera arrangement control unit, 117 edit screen generation unit, 118 continuous operation determination unit, 120 storage unit, 131 game program, 132 game information, 133 user information, 151 input unit, 152 display unit, 200 server (information processing apparatus, server computer), 1010 object, 1020 controller, 1030 storage medium

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention enables various types of input operations. A control unit (110) of a user terminal is configured to: accept an input operation performed on a touch screen (15); control the arrangement of a virtual camera; display an image captured by the virtual camera on the touch screen; and in the case of accepting an operation that is inputted by shifting the touch position on the touch screen, cause a game character to execute an action according to a result of the comparison between the direction of the touch operation and the direction in which the game character is oriented.

Description

ゲームプログラム、方法および情報処理装置GAME PROGRAM, METHOD, AND INFORMATION PROCESSING DEVICE

 本開示は、ゲームプログラム、方法および情報処理装置に関する。 The present disclosure relates to a game program, a method, and an information processing apparatus.

 特許文献1には、ユーザによるタッチ入力によりゲームを進行するインターフェース・プログラムが記載されている。 Patent Document 1 describes an interface program for progressing a game by a touch input by a user.

特開2016-129591号公報(2016年7月21日公開)JP 2016-129591 A (published July 21, 2016)

 特許文献1に開示のインターフェース・プログラムによれば、ユーザはタイミングに合わせてタッチ入力しようとするため、高い趣向性のゲームを提供できる。しかし、さらに趣向性を向上させる技術が求められている。 According to the interface program disclosed in Patent Document 1, since the user tries to perform touch input in accordance with the timing, a highly interesting game can be provided. However, there is a need for a technique that further improves the taste.

 本開示の目的は、より趣向性の高いアクションゲームを提供することにある。 The purpose of this disclosure is to provide a more interesting action game.

 上記の課題を解決するために、本発明の一態様に係るゲームプログラムは、プロセッサと、メモリと、タッチスクリーンとを備えるコンピュータにより実行されるゲームプログラムである。ゲームプログラムに基づくゲームは、タッチスクリーンに対する操作を受け付けると、該操作に関連付けられたアクションを、ユーザが操作する操作キャラクタに実行させるゲームである。ゲームプログラムは、プロセッサに、タッチスクリーンに対する連続する操作よりなる操作列に含まれる各操作に、該操作の操作列における順序に応じたアクションを関連付ける編集画面を表示するステップ、を実行させる。 In order to solve the above problems, a game program according to an aspect of the present invention is a game program executed by a computer including a processor, a memory, and a touch screen. A game based on a game program is a game that causes an operation character operated by a user to execute an action associated with the operation when an operation on the touch screen is accepted. The game program causes the processor to execute a step of displaying an editing screen that associates each operation included in the operation sequence including continuous operations on the touch screen with an action according to the order in the operation sequence of the operation.

 また、上記の課題を解決するために、本発明の一態様に係る方法は、ゲームプログラムを実行する方法である。ゲームプログラムは、プロセッサと、メモリと、タッチスクリーンとを備えるコンピュータにより実行されるものである、ゲームプログラムに基づくゲームは、タッチスクリーンに対する操作を受け付けると、該操作に関連付けられたアクションを、ユーザが操作する操作キャラクタに実行させるゲームである。方法は、プロセッサが、タッチスクリーンに対する連続する操作よりなる操作列に含まれる各操作に、該操作の操作列における順序に応じたアクションを関連付ける編集画面を表示する。 In order to solve the above problems, a method according to one aspect of the present invention is a method for executing a game program. The game program is executed by a computer including a processor, a memory, and a touch screen. The game based on the game program is a game in which, when an operation on the touch screen is accepted, an operation character operated by the user executes the action associated with that operation. In the method, the processor displays an editing screen that associates each operation included in an operation sequence, consisting of successive operations on the touch screen, with an action corresponding to the position of that operation in the sequence.

 また、上記の課題を解決するために、本発明の一態様に係る情報処理装置は、ゲームプログラムを記憶する記憶部と、ゲームプログラムを実行することにより、情報処理装置の動作を制御する制御部と、タッチスクリーンと、を備える。ゲームプログラムに基づくゲームは、タッチスクリーンに対する操作を受け付けると、該操作に関連付けられたアクションを、ユーザが操作する操作キャラクタに実行させるゲームである。制御部は、タッチスクリーンに対する連続する操作よりなる操作列に含まれる各操作に、該操作の操作列における順序に応じたアクションを関連付ける編集画面を表示する。 In order to solve the above problems, an information processing apparatus according to one aspect of the present invention includes a storage unit that stores a game program, a control unit that controls the operation of the information processing apparatus by executing the game program, and a touch screen. The game based on the game program is a game in which, when an operation on the touch screen is accepted, an operation character operated by the user executes the action associated with that operation. The control unit displays an editing screen that associates each operation included in an operation sequence, consisting of successive operations on the touch screen, with an action corresponding to the position of that operation in the sequence.
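For illustration only, the association between an operation sequence and order-dependent actions described in the above aspects can be pictured as a small data model. All identifiers here (ComboEditor, assign, resolve, the action names) are hypothetical and do not appear in the disclosed game program:

```python
# Hypothetical sketch: each slot in a sequence of successive touch operations
# is bound to an action, and the action chosen for a slot depends on its
# position (order) in the sequence. Names and defaults are assumptions.
from dataclasses import dataclass, field


@dataclass
class ComboEditor:
    # slot index (0-based order in the operation sequence) -> action name
    slots: dict[int, str] = field(default_factory=dict)

    def assign(self, order: int, action: str) -> None:
        """Associate the operation at position `order` with `action`."""
        self.slots[order] = action

    def resolve(self, order: int) -> str:
        """Action executed for the `order`-th successive operation."""
        return self.slots.get(order, "basic_attack")


editor = ComboEditor()
editor.assign(0, "slash")
editor.assign(1, "kick")
editor.assign(2, "finisher")

# Three successive taps now trigger slash -> kick -> finisher.
combo = [editor.resolve(i) for i in range(3)]
print(combo)
```

In this sketch, editing the screen simply rewrites the `slots` mapping; an unassigned position falls back to a default action.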

 本開示によれば、より趣向性の高いアクションゲームを提供することができる。 According to the present disclosure, it is possible to provide a more engaging action game.

ゲームシステムのハードウェア構成を示す図である。A diagram showing the hardware configuration of a game system. ユーザ端末の機能的構成を示す図である。A diagram showing the functional configuration of a user terminal. 入力操作の種類を検知するために入力操作受付部が参照する履歴情報テーブルの一例を示す図である。A diagram showing an example of a history information table referred to by an input operation accepting unit in order to detect the type of input operation. ゲームシステムによって実行される処理の流れを示すフローチャートの一例である。An example of a flowchart showing the flow of processing executed by the game system. 表示制御部がタッチスクリーンに表示させるゲーム空間画像の一例を模式的に示す図である。A diagram schematically showing an example of a game space image displayed on the touch screen by a display control unit. ゲームシステムによって実行される処理の流れを示すフローチャートの他の例である。Another example of a flowchart showing the flow of processing executed by the game system. オブジェクト制御部がコンボデータに基づいてキャラクタに攻撃動作を実行させたときのアクションの一例を示す図である。A diagram showing an example of an action when an object control unit causes a character to execute an attack action based on combo data. ユーザ端末およびサーバの機能的構成を示すブロック図である。A block diagram showing the functional configurations of a user terminal and a server. (A)及び(B)は、回転操作の検出手法の一例を模式的に説明する図である。(A) and (B) are diagrams schematically illustrating an example of a method for detecting a rotation operation. (A)及び(B)は、表示部に表示される編集画面の具体例を示す図である。(A) and (B) are diagrams showing specific examples of an editing screen displayed on a display unit. 本実施形態に係るゲームプログラムに基づいて、編集画面において実行される処理の流れを示すフローチャートである。A flowchart showing the flow of processing executed on the editing screen based on the game program according to the present embodiment. (A)~(C)は、表示部に表示される編集画面の具体例を示す図である。(A) to (C) are diagrams showing specific examples of the editing screen displayed on the display unit. 
本実施形態に係るゲームプログラムに基づいて、連続する操作に応じて実行される処理の流れを示すフローチャートである。A flowchart showing the flow of processing executed in response to successive operations based on the game program according to the present embodiment. (A)~(C)は、表示部に表示される編集画面においてお勧め操作列情報が自動設定される具体例を示す図である。(A) to (C) are diagrams showing specific examples in which recommended operation sequence information is automatically set on the editing screen displayed on the display unit. (A)~(B)は、表示部に表示される編集画面において操作列情報が共有される具体例を示す図である。(A) and (B) are diagrams showing specific examples in which operation sequence information is shared on the editing screen displayed on the display unit. 表示部に表示される編集画面の他の具体例を示す図である。A diagram showing another specific example of the editing screen displayed on the display unit.

 〔実施形態〕
 本開示に係るゲームシステムは、複数のユーザにゲームを提供するためのシステムである。以下、ゲームシステムについて図面を参照しつつ説明する。なお、本発明はこれらの例示に限定されるものではなく、特許請求の範囲によって示され、特許請求の範囲と均等の意味および範囲内でのすべての変更が本発明に含まれることが意図される。以下の説明では、図面の説明において同一の要素には同一の符号を付し、重複する説明を繰り返さない。
Embodiment
A game system according to the present disclosure is a system for providing a game to a plurality of users. Hereinafter, the game system will be described with reference to the drawings. Note that the present invention is not limited to these exemplifications; it is defined by the scope of the claims, and all modifications within the meaning and scope equivalent to the claims are intended to be included in the present invention. In the following description, the same elements are denoted by the same reference numerals in the drawings, and redundant descriptions are not repeated.

 <ゲームシステム1のハードウェア構成>
 図1は、ゲームシステム1のハードウェア構成を示す図である。ゲームシステム1は図示の通り、複数のユーザ端末100と、サーバ200とを含む。各ユーザ端末100は、サーバ200とネットワーク2を介して接続する。ネットワーク2は、インターネットおよび図示しない無線基地局によって構築される各種移動通信システム等で構成される。この移動通信システムとしては、例えば、所謂3G、4G移動通信システム、LTE(LongTerm Evolution)、および所定のアクセスポイントによってインターネットに接続可能な無線ネットワーク(例えばWi-Fi(登録商標))等が挙げられる。
<Hardware configuration of game system 1>
FIG. 1 is a diagram illustrating the hardware configuration of the game system 1. As illustrated, the game system 1 includes a plurality of user terminals 100 and a server 200. Each user terminal 100 connects to the server 200 via the network 2. The network 2 includes the Internet and various mobile communication systems constructed with radio base stations (not shown). Examples of such mobile communication systems include so-called 3G and 4G mobile communication systems, LTE (Long Term Evolution), and wireless networks (for example, Wi-Fi (registered trademark)) that can connect to the Internet through predetermined access points.

 サーバ200(コンピュータ、情報処理装置)は、ワークステーションまたはパーソナルコンピュータ等の汎用コンピュータであってよい。サーバ200は、プロセッサ20と、メモリ21と、ストレージ22と、通信IF23と、入出力IF24とを備える。サーバ200が備えるこれらの構成は、通信バスによって互いに電気的に接続される。 The server 200 (computer, information processing apparatus) may be a general-purpose computer such as a workstation or a personal computer. The server 200 includes a processor 20, a memory 21, a storage 22, a communication IF 23, and an input / output IF 24. These components included in the server 200 are electrically connected to each other via a communication bus.

 ユーザ端末100(コンピュータ、情報処理装置)は、スマートフォン、フィーチャーフォン、PDA(Personal Digital Assistant)、またはタブレット型コンピュータ等の携帯端末であってよい。ユーザ端末100は、ゲームプレイに適したゲーム装置であってもよい。ユーザ端末100は図示の通り、プロセッサ10と、メモリ11と、ストレージ12と、通信インターフェース(IF)13と、入出力IF14と、タッチスクリーン15(表示部)と、カメラ17と、測距センサ18とを備える。ユーザ端末100が備えるこれらの構成は、通信バスによって互いに電気的に接続される。また、図1に示すように、ユーザ端末100は、1つ以上のコントローラ1020と通信可能に構成されることとしてもよい。コントローラ1020は、例えば、Bluetooth(登録商標)等の通信規格に従って、ユーザ端末100と通信を確立する。コントローラ1020は、1つ以上のボタン等を有していてもよく、該ボタン等に対するユーザの入力操作に基づく出力値をユーザ端末100へ送信する。また、コントローラ1020は、加速度センサ、および、角速度センサ等の各種センサを有していてもよく、該各種センサの出力値をユーザ端末100へ送信する。 The user terminal 100 (computer, information processing apparatus) may be a mobile terminal such as a smartphone, a feature phone, a PDA (Personal Digital Assistant), or a tablet computer. The user terminal 100 may also be a game device suitable for game play. As illustrated, the user terminal 100 includes a processor 10, a memory 11, a storage 12, a communication interface (IF) 13, an input/output IF 14, a touch screen 15 (display unit), a camera 17, and a distance measuring sensor 18. These components included in the user terminal 100 are electrically connected to each other via a communication bus. Further, as shown in FIG. 1, the user terminal 100 may be configured to be able to communicate with one or more controllers 1020. The controller 1020 establishes communication with the user terminal 100 in accordance with a communication standard such as Bluetooth (registered trademark), for example. The controller 1020 may have one or more buttons and the like, and transmits to the user terminal 100 output values based on the user's input operations on those buttons. The controller 1020 may also include various sensors such as an acceleration sensor and an angular velocity sensor, and transmits the output values of those sensors to the user terminal 100.

 なお、ユーザ端末100がカメラ17および測距センサ18を備えることに代えて、または、加えて、コントローラ1020がカメラ17および測距センサ18を有していてもよい。 Note that the controller 1020 may include the camera 17 and the distance measuring sensor 18 instead of or in addition to the user terminal 100 including the camera 17 and the distance measuring sensor 18.

 ユーザ端末100は、例えばゲーム開始時に、コントローラ1020を使用するユーザに、該ユーザの名前またはログインID等のユーザ識別情報を、該コントローラ1020を介して入力させることが望ましい。これにより、ユーザ端末100は、コントローラ1020とユーザとを紐付けることが可能となり、受信した出力値の送信元(コントローラ1020)に基づいて、該出力値がどのユーザのものであるかを特定することができる。 The user terminal 100 desirably allows a user who uses the controller 1020 to input user identification information, such as the user's name or login ID, via the controller 1020 at the start of the game, for example. This allows the user terminal 100 to associate the controller 1020 with the user, and to identify, based on the transmission source (controller 1020) of a received output value, which user that output value belongs to.

 ユーザ端末100が複数のコントローラ1020と通信する場合、各コントローラ1020を各ユーザが把持することで、ネットワーク2を介してサーバ200などの他の装置と通信せずに、該1台のユーザ端末100でマルチプレイを実現することができる。また、各ユーザ端末100が無線LAN(Local Area Network)規格等の無線規格により互いに通信接続する(サーバ200を介さずに通信接続する)ことで、複数台のユーザ端末100によりローカルでマルチプレイを実現することもできる。1台のユーザ端末100によりローカルで上述のマルチプレイを実現する場合、ユーザ端末100は、さらに、サーバ200が備える後述する種々の機能の少なくとも一部を備えていてもよい。また、複数のユーザ端末100によりローカルで上述のマルチプレイを実現する場合、複数のユーザ端末100は、サーバ200が備える後述する種々の機能を分散して備えていてもよい。 When the user terminal 100 communicates with a plurality of controllers 1020, each user can hold a controller 1020, and multiplayer can be realized on that single user terminal 100 without communicating with other devices such as the server 200 via the network 2. In addition, by connecting user terminals 100 to each other via a wireless standard such as the wireless LAN (Local Area Network) standard (that is, communicating without going through the server 200), multiplayer can also be realized locally with a plurality of user terminals 100. When the above-described multiplayer is realized locally on a single user terminal 100, the user terminal 100 may further include at least some of the various functions of the server 200 described later. When the above-described multiplayer is realized locally by a plurality of user terminals 100, the plurality of user terminals 100 may include the various functions of the server 200 described later in a distributed manner.

 なお、ローカルで上述のマルチプレイを実現する場合であっても、ユーザ端末100はサーバ200と通信を行ってもよい。例えば、あるゲームにおける成績または勝敗等のプレイ結果を示す情報と、ユーザ識別情報とを対応付けてサーバ200に送信してもよい。 Note that the user terminal 100 may communicate with the server 200 even when the above-described multiplayer is realized locally. For example, information indicating a play result, such as a score or a win or loss in a certain game, may be associated with the user identification information and transmitted to the server 200.

 また、コントローラ1020は、ユーザ端末100に着脱可能な構成であるとしてもよい。この場合、ユーザ端末100の筐体における少なくともいずれかの面に、コントローラ1020との結合部が設けられていてもよい。該結合部を介して有線によりユーザ端末100とコントローラ1020とが結合している場合は、ユーザ端末100とコントローラ1020とは、有線を介して信号を送受信する。 Further, the controller 1020 may be configured to be detachable from the user terminal 100. In this case, a coupling portion with the controller 1020 may be provided on at least one surface of the housing of the user terminal 100. When the user terminal 100 and the controller 1020 are coupled by wire through the coupling unit, the user terminal 100 and the controller 1020 transmit and receive signals via the wire.

 図1に示すように、ユーザ端末100は、外部のメモリカード等の記憶媒体1030の装着を、入出力IF14を介して受け付けてもよい。これにより、ユーザ端末100は、記憶媒体1030に記録されるプログラム及びデータを読み込むことができる。記憶媒体1030に記録されるプログラムは、例えばゲームプログラムである。 As shown in FIG. 1, the user terminal 100 may accept attachment of a storage medium 1030 such as an external memory card via the input / output IF 14. Accordingly, the user terminal 100 can read the program and data recorded in the storage medium 1030. The program recorded in the storage medium 1030 is a game program, for example.

 ユーザ端末100は、サーバ200等の外部の装置と通信することにより取得したゲームプログラムをユーザ端末100のメモリ11に記憶してもよいし、記憶媒体1030から読み込むことにより取得したゲームプログラムをメモリ11に記憶してもよい。 The user terminal 100 may store in the memory 11 of the user terminal 100 a game program acquired by communicating with an external device such as the server 200, or may store in the memory 11 a game program acquired by reading it from the storage medium 1030.

 以上で説明したとおり、ユーザ端末100は、該ユーザ端末100に対して情報を入力する機構の一例として、通信IF13、入出力IF14、タッチスクリーン15、カメラ17、および、測距センサ18を備える。入力する機構としての上述の各部は、ユーザの入力操作を受け付けるように構成された操作部と捉えることができる。 As described above, the user terminal 100 includes the communication IF 13, the input / output IF 14, the touch screen 15, the camera 17, and the distance measuring sensor 18 as an example of a mechanism for inputting information to the user terminal 100. Each of the above-described units serving as an input mechanism can be regarded as an operation unit configured to accept a user input operation.

 例えば、操作部が、カメラ17および測距センサ18の少なくともいずれか一方で構成される場合、該操作部が、ユーザ端末100の近傍の物体1010を検出し、当該物体の検出結果から入力操作を特定する。一例として、物体1010としてのユーザの手、予め定められた形状のマーカーなどが検出され、検出結果として得られた物体1010の色、形状、動き、または、種類などに基づいて入力操作が特定される。より具体的には、ユーザ端末100は、カメラ17の撮影画像からユーザの手が検出された場合、該撮影画像に基づき検出されるジェスチャ(ユーザの手の一連の動き)を、ユーザの入力操作として特定し、受け付ける。なお、撮影画像は静止画であっても動画であってもよい。 For example, when the operation unit is configured by at least one of the camera 17 and the distance measuring sensor 18, the operation unit detects an object 1010 in the vicinity of the user terminal 100 and identifies an input operation from the result of detecting that object. As an example, the user's hand, a marker having a predetermined shape, or the like is detected as the object 1010, and an input operation is identified based on the color, shape, movement, or type of the object 1010 obtained as the detection result. More specifically, when the user's hand is detected in an image captured by the camera 17, the user terminal 100 identifies and accepts, as the user's input operation, a gesture (a series of movements of the user's hand) detected based on the captured image. Note that the captured image may be a still image or a moving image.

 あるいは、操作部がタッチスクリーン15で構成される場合、ユーザ端末100は、タッチスクリーン15の入力部151に対して実施されたユーザの操作をユーザの入力操作として特定し、受け付ける。あるいは、操作部が通信IF13で構成される場合、ユーザ端末100は、コントローラ1020から送信される信号(例えば、出力値)をユーザの入力操作として特定し、受け付ける。あるいは、操作部が入出力IF14で構成される場合、該入出力IF14と接続されるコントローラ1020とは異なる入力装置(図示せず)から出力される信号をユーザの入力操作として特定し、受け付ける。 Alternatively, when the operation unit is configured by the touch screen 15, the user terminal 100 identifies and accepts a user operation performed on the input unit 151 of the touch screen 15 as the user's input operation. Alternatively, when the operation unit is configured by the communication IF 13, the user terminal 100 identifies and accepts a signal (for example, an output value) transmitted from the controller 1020 as the user's input operation. Alternatively, when the operation unit is configured by the input/output IF 14, the user terminal 100 identifies and accepts, as the user's input operation, a signal output from an input device (not shown) that is different from the controller 1020 and connected to the input/output IF 14.

 <ゲーム概要>
 ゲームシステム1に基づくゲームは、ユーザの操作に応じて、ユーザによって操作されるキャラクタがアクションを実行するゲームである。以降、ユーザによって操作されるキャラクタを、操作キャラクタと記載する。詳細には、ゲームシステム1に基づくゲームは、ユーザのタッチスクリーン15に対する連続した操作に応じて、操作キャラクタに連続したアクションを実行させることが可能なゲームである。そのようなゲームの一例としては、操作キャラクタが敵キャラクタと戦闘する際にコンボアクションを実行するアクションゲームが挙げられる。また、そのようなアクションゲームを含むRPG(Role-Playing Game)が挙げられる。また、そのようなRPGは、一例として、複数のユーザが各自のユーザ端末を介して同時に1つのゲーム空間に参加するMMORPG(Massively Multiplayer Online Role-Playing Game)であってもよい。また、そのようなMMORPGは、一例として、操作キャラクタが仮想的なゲーム空間内を自由に移動可能なオープンワールドのゲームであってもよい。ただし、ゲームシステム1に基づくゲームは、これらの例示したタイプのゲームに限定されない。
<Game overview>
A game based on the game system 1 is a game in which a character operated by a user executes actions in response to the user's operations. Hereinafter, the character operated by the user is referred to as an operation character. Specifically, the game based on the game system 1 is a game that allows the operation character to execute consecutive actions in response to consecutive operations on the touch screen 15 by the user. An example of such a game is an action game in which the operation character executes combo actions when battling an enemy character. Another example is an RPG (Role-Playing Game) that includes such an action game. Such an RPG may be, for example, an MMORPG (Massively Multiplayer Online Role-Playing Game) in which a plurality of users simultaneously participate in one game space via their own user terminals. Such an MMORPG may also be, for example, an open-world game in which the operation character can move freely within a virtual game space. However, the game based on the game system 1 is not limited to these illustrated types of games.

 <各装置のハードウェア構成要素>
 プロセッサ10は、ユーザ端末100全体の動作を制御する。プロセッサ20は、サーバ200全体の動作を制御する。プロセッサ10および20は、CPU(Central Processing Unit)、MPU(Micro Processing Unit)、およびGPU(Graphics Processing Unit)を含む。
<Hardware components of each device>
The processor 10 controls the operation of the entire user terminal 100. The processor 20 controls the operation of the entire server 200. The processors 10 and 20 include a central processing unit (CPU), a micro processing unit (MPU), and a graphics processing unit (GPU).

 プロセッサ10は後述するストレージ12からプログラムを読み出し、後述するメモリ11に展開する。プロセッサ20は後述するストレージ22からプログラムを読み出し、後述するメモリ21に展開する。プロセッサ10およびプロセッサ20は展開したプログラムを実行する。 The processor 10 reads a program from the storage 12 described later and expands it in the memory 11 described later. The processor 20 reads a program from a storage 22 described later and develops it in a memory 21 described later. The processor 10 and the processor 20 execute the developed program.

 メモリ11および21は主記憶装置である。メモリ11および21は、ROM(Read Only Memory)およびRAM(Random Access Memory)等の記憶装置で構成される。メモリ11は、プロセッサ10が後述するストレージ12から読み出したプログラムおよび各種データを一時的に記憶することにより、プロセッサ10に作業領域を提供する。メモリ11は、プロセッサ10がプログラムに従って動作している間に生成した各種データも一時的に記憶する。メモリ21は、プロセッサ20が後述するストレージ22から読み出した各種プログラムおよびデータを一時的に記憶することにより、プロセッサ20に作業領域を提供する。メモリ21は、プロセッサ20がプログラムに従って動作している間に生成した各種データも一時的に記憶する。 The memories 11 and 21 are main storage devices. The memories 11 and 21 include storage devices such as a ROM (Read Only Memory) and a RAM (Random Access Memory). The memory 11 provides a work area to the processor 10 by temporarily storing a program and various data read from the storage 12 described later by the processor 10. The memory 11 also temporarily stores various data generated while the processor 10 is operating according to the program. The memory 21 provides the work area to the processor 20 by temporarily storing various programs and data read from the storage 22 described later by the processor 20. The memory 21 temporarily stores various data generated while the processor 20 is operating according to the program.

 本実施形態においてプログラムとは、ゲームをユーザ端末100により実現するためのゲームプログラムであってもよい。あるいは、該プログラムは、該ゲームをユーザ端末100とサーバ200との協働により実現するためのゲームプログラムであってもよい。あるいは、該プログラムは、該ゲームを複数のユーザ端末100の協働により実現するためのゲームプログラムであってもよい。また、各種データとはユーザ情報、ゲーム情報等、ゲームに関するデータ、ならびにユーザ端末100とサーバ200との間または複数のユーザ端末100間で送受信する指示や通知を含んでいる。 In the present embodiment, the program may be a game program for realizing the game by the user terminal 100. Alternatively, the program may be a game program for realizing the game by cooperation between the user terminal 100 and the server 200. Alternatively, the program may be a game program for realizing the game by cooperation of a plurality of user terminals 100. The various data includes data related to the game such as user information and game information, and instructions and notifications transmitted and received between the user terminal 100 and the server 200 or between the plurality of user terminals 100.

 ストレージ12および22は補助記憶装置である。ストレージ12および22は、フラッシュメモリまたはHDD(Hard Disk Drive)等の記憶装置で構成される。ストレージ12およびストレージ22には、ゲームに関する各種データが格納される。 Storage 12 and 22 are auxiliary storage devices. The storages 12 and 22 are configured by a storage device such as a flash memory or an HDD (Hard Disk Drive). Various data relating to the game is stored in the storage 12 and the storage 22.

 通信IF13は、ユーザ端末100における各種データの送受信を制御する。通信IF23は、サーバ200における各種データの送受信を制御する。通信IF13および23は例えば、無線LAN(Local Area Network)を介する通信、有線LAN、無線LAN、または携帯電話回線網を介したインターネット通信、ならびに近距離無線通信等を用いた通信を制御する。 The communication IF 13 controls the transmission and reception of various data in the user terminal 100. The communication IF 23 controls the transmission and reception of various data in the server 200. The communication IFs 13 and 23 control, for example, communication via a wireless LAN (Local Area Network); Internet communication via a wired LAN, a wireless LAN, or a mobile phone network; and communication using short-range wireless communication and the like.

 入出力IF14は、ユーザ端末100がデータの入力を受け付けるためのインターフェースであり、またユーザ端末100がデータを出力するためのインターフェースである。入出力IF14は、USB(Universal Serial Bus)等を介してデータの入出力を行ってもよい。入出力IF14は、例えば、ユーザ端末100の物理ボタン、カメラ、マイク、または、スピーカ等を含み得る。サーバ200の入出力IF24は、サーバ200がデータの入力を受け付けるためのインターフェースであり、またサーバ200がデータを出力するためのインターフェースである。入出力IF24は、例えば、マウスまたはキーボード等の情報入力機器である入力部と、画像を表示出力する機器である表示部とを含み得る。 The input / output IF 14 is an interface for the user terminal 100 to accept data input, and is an interface for the user terminal 100 to output data. The input / output IF 14 may input / output data via a USB (Universal Serial Bus) or the like. The input / output IF 14 may include, for example, a physical button of the user terminal 100, a camera, a microphone, a speaker, or the like. The input / output IF 24 of the server 200 is an interface for the server 200 to accept data input, and is an interface for the server 200 to output data. The input / output IF 24 may include, for example, an input unit that is an information input device such as a mouse or a keyboard, and a display unit that is a device that displays and outputs an image.

 ユーザ端末100のタッチスクリーン15は、入力部151と表示部152とを組み合わせた電子部品である。入力部151は、例えばタッチセンシティブなデバイスであり、例えばタッチパッドによって構成される。表示部152は、例えば液晶ディスプレイ、または有機EL(Electro-Luminescence)ディスプレイ等によって構成される。 The touch screen 15 of the user terminal 100 is an electronic component in which an input unit 151 and a display unit 152 are combined. The input unit 151 is a touch-sensitive device, for example, and is configured by a touch pad, for example. The display unit 152 is configured by, for example, a liquid crystal display or an organic EL (Electro-Luminescence) display.

 入力部151は、入力面に対しユーザの操作(主にタッチ操作、スライド操作、スワイプ操作、およびタップ操作等の物理的接触操作)が入力された位置を検知して、位置を示す情報を入力信号として送信する機能を備える。入力部151は、図示しないタッチセンシング部を備えていればよい。タッチセンシング部は、静電容量方式または抵抗膜方式等のどのような方式を採用したものであってもよい。 The input unit 151 has a function of detecting the position at which a user operation (mainly a physical contact operation such as a touch operation, a slide operation, a swipe operation, or a tap operation) is input on the input surface, and of transmitting information indicating that position as an input signal. The input unit 151 only needs to include a touch sensing unit (not shown). The touch sensing unit may adopt any method, such as a capacitance method or a resistive film method.
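As one hedged illustration of how a history of the positions reported by such an input unit could be used to distinguish operation types (a tap versus a slide or swipe), consider the following sketch. The function name and the threshold value are assumptions for this example and are not part of the disclosure:

```python
# Illustrative sketch: classify one contact's operation type from its
# successive (x, y) positions. A contact that barely moves is a tap; one
# whose end point is far from its start point is a swipe. The 10-pixel
# threshold is an arbitrary value chosen for this example.
import math


def classify_touch(history: list[tuple[float, float]],
                   move_threshold: float = 10.0) -> str:
    """history: successive (x, y) touch positions of a single contact."""
    if len(history) < 2:
        return "tap"
    x0, y0 = history[0]
    x1, y1 = history[-1]
    distance = math.hypot(x1 - x0, y1 - y0)
    return "swipe" if distance >= move_threshold else "tap"


print(classify_touch([(100, 100), (101, 99)]))               # small jitter
print(classify_touch([(100, 100), (140, 100), (180, 100)]))  # clear movement
```

A real implementation would likely also consider timing and intermediate points (for example, to distinguish a slide from a flick), which this sketch omits.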

 図示していないが、ユーザ端末100は、該ユーザ端末100の保持姿勢を特定するための1以上のセンサを備えていてもよい。このセンサは、例えば、加速度センサ、または、角速度センサ等であってもよい。ユーザ端末100がセンサを備えている場合、プロセッサ10は、センサの出力からユーザ端末100の保持姿勢を特定して、保持姿勢に応じた処理を行うことも可能になる。例えば、プロセッサ10は、ユーザ端末100が縦向きに保持されているときには、縦長の画像を表示部152に表示させる縦画面表示としてもよい。一方、ユーザ端末100が横向きに保持されているときには、横長の画像を表示部に表示させる横画面表示としてもよい。このように、プロセッサ10は、ユーザ端末100の保持姿勢に応じて縦画面表示と横画面表示とを切り替え可能であってもよい。 Although not shown, the user terminal 100 may include one or more sensors for specifying the holding posture of the user terminal 100. This sensor may be, for example, an acceleration sensor or an angular velocity sensor. When the user terminal 100 includes a sensor, the processor 10 can specify the holding posture of the user terminal 100 from the output of the sensor and perform processing according to the holding posture. For example, when the user terminal 100 is held in the portrait orientation, the processor 10 may perform a portrait screen display in which a portrait image is displayed on the display unit 152. On the other hand, when the user terminal 100 is held sideways, a horizontal screen display in which a horizontally long image is displayed on the display unit may be used. As described above, the processor 10 may be able to switch between the vertical screen display and the horizontal screen display according to the holding posture of the user terminal 100.
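A minimal sketch of the portrait/landscape switching described above, assuming the holding posture is estimated from the gravity components reported by an acceleration sensor. The function name and the axis convention are illustrative assumptions, not taken from the disclosure:

```python
# Assumed logic: when gravity acts mainly along the device's y axis, the
# device is held upright (portrait); when it acts mainly along the x axis,
# the device is held sideways (landscape).
def display_mode(accel_x: float, accel_y: float) -> str:
    """Return 'portrait' or 'landscape' from two gravity components (m/s^2)."""
    return "portrait" if abs(accel_y) >= abs(accel_x) else "landscape"


print(display_mode(0.1, 9.8))  # held upright
print(display_mode(9.8, 0.2))  # held sideways
```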

 カメラ17は、イメージセンサ等を含み、レンズから入射する入射光を電気信号に変換することで撮影画像を生成する。 The camera 17 includes an image sensor and the like, and generates a captured image by converting incident light incident from the lens into an electric signal.

 測距センサ18は、測定対象物までの距離を測定するセンサである。測距センサ18は、例えば、パルス変換した光を発する光源と、光を受ける受光素子とを含む。測距センサ18は、光源からの発光タイミングと、該光源から発せられた光が測定対象物にあたって反射されて生じる反射光の受光タイミングとにより、測定対象物までの距離を測定する。測距センサ18は、指向性を有する光を発する光源を有することとしてもよい。 The distance measuring sensor 18 is a sensor that measures the distance to the measurement object. The distance measuring sensor 18 includes, for example, a light source that emits pulse-converted light and a light receiving element that receives the light. The distance measuring sensor 18 measures the distance to the measurement object based on the light emission timing from the light source and the light reception timing of the reflected light generated when the light emitted from the light source is reflected by the measurement object. The distance measuring sensor 18 may include a light source that emits light having directivity.
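The pulsed-light distance measurement described above follows the standard time-of-flight relation: the pulse travels to the object and back, so the distance is half the round-trip time multiplied by the speed of light. A numerical sketch (the function name is assumed):

```python
# distance = (speed of light x round-trip time) / 2
SPEED_OF_LIGHT = 299_792_458.0  # m/s


def tof_distance(emit_time_s: float, receive_time_s: float) -> float:
    """Distance in meters, from pulse emission and reception timestamps."""
    return SPEED_OF_LIGHT * (receive_time_s - emit_time_s) / 2.0


# A round trip of 10 nanoseconds corresponds to roughly 1.5 m.
print(round(tof_distance(0.0, 10e-9), 3))
```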

 ここで、ユーザ端末100が、カメラ17と測距センサ18とを用いて、ユーザ端末100の近傍の物体1010を検出した検出結果を、ユーザの入力操作として受け付ける例をさらに説明する。カメラ17および測距センサ18は、例えば、ユーザ端末100の筐体の側面に設けられてもよい。カメラ17の近傍に測距センサ18が設けられてもよい。カメラ17としては、例えば赤外線カメラを用いることができる。この場合、赤外線を照射する照明装置および可視光を遮断するフィルタ等が、カメラ17に設けられてもよい。これにより、屋外か屋内かにかかわらず、カメラ17の撮影画像に基づく物体の検出精度をいっそう向上させることができる。 Here, an example in which the user terminal 100 receives a detection result of detecting an object 1010 in the vicinity of the user terminal 100 using the camera 17 and the distance measuring sensor 18 as a user input operation will be further described. The camera 17 and the distance measuring sensor 18 may be provided on the side surface of the housing of the user terminal 100, for example. A distance measuring sensor 18 may be provided in the vicinity of the camera 17. As the camera 17, for example, an infrared camera can be used. In this case, the camera 17 may be provided with an illumination device that emits infrared light, a filter that blocks visible light, and the like. Thereby, it is possible to further improve the detection accuracy of the object based on the captured image of the camera 17 regardless of whether it is outdoors or indoors.

 プロセッサ10は、カメラ17の撮影画像に対して、例えば以下の(1)~(5)に示す処理のうち1つ以上の処理を行ってもよい。(1)プロセッサ10は、カメラ17の撮影画像に対し画像認識処理を行うことで、該撮影画像にユーザの手が含まれているか否かを特定する。プロセッサ10は、上述の画像認識処理において採用する解析技術として、例えばパターンマッチング等の技術を用いてよい。(2)また、プロセッサ10は、ユーザの手の形状から、ユーザのジェスチャを検出する。プロセッサ10は、例えば、撮影画像から検出されるユーザの手の形状から、ユーザの指の本数(伸びている指の本数)を特定する。プロセッサ10はさらに、特定した指の本数から、ユーザが行ったジェスチャを特定する。例えば、プロセッサ10は、指の本数が5本である場合、ユーザが「パー」のジェスチャを行ったと判定する。また、プロセッサ10は、指の本数が0本である(指が検出されなかった)場合、ユーザが「グー」のジェスチャを行ったと判定する。また、プロセッサ10は、指の本数が2本である場合、ユーザが「チョキ」のジェスチャを行ったと判定する。(3)プロセッサ10は、カメラ17の撮影画像に対し、画像認識処理を行うことにより、ユーザの指が人差し指のみ立てた状態であるか、ユーザの指がはじくような動きをしたかを検出する。(4)プロセッサ10は、カメラ17の撮影画像の画像認識結果、および、測距センサ18の出力値等の少なくともいずれか1つに基づいて、ユーザ端末100の近傍の物体1010(ユーザの手など)とユーザ端末100との距離を検出する。例えば、プロセッサ10は、カメラ17の撮影画像から特定されるユーザの手の形状の大小により、ユーザの手がユーザ端末100の近傍(例えば所定値未満の距離)にあるのか、遠く(例えば所定値以上の距離)にあるのかを検出する。なお、撮影画像が動画の場合、プロセッサ10は、ユーザの手がユーザ端末100に接近しているのか遠ざかっているのかを検出してもよい。(5)カメラ17の撮影画像の画像認識結果等に基づいて、ユーザの手が検出されている状態で、ユーザ端末100とユーザの手との距離が変化していることが判明した場合、プロセッサ10は、ユーザが手をカメラ17の撮影方向において振っていると認識する。カメラ17の撮影範囲よりも指向性が強い測距センサ18において、物体が検出されたりされなかったりする場合に、プロセッサ10は、ユーザが手をカメラの撮影方向に直交する方向に振っていると認識する。 The processor 10 may perform, for example, one or more of the following processes (1) to (5) on the image captured by the camera 17. (1) The processor 10 performs image recognition processing on the captured image of the camera 17 to determine whether the captured image contains the user's hand. The processor 10 may use a technique such as pattern matching as the analysis technique employed in this image recognition processing. (2) The processor 10 also detects the user's gesture from the shape of the user's hand. For example, the processor 10 identifies the number of the user's fingers (the number of extended fingers) from the shape of the user's hand detected in the captured image, and further identifies the gesture performed by the user from the identified number of fingers. For example, when the number of fingers is five, the processor 10 determines that the user has performed a "paper" gesture; when the number of fingers is zero (no fingers detected), that the user has performed a "rock" gesture; and when the number of fingers is two, that the user has performed a "scissors" gesture. (3) The processor 10 performs image recognition processing on the image captured by the camera 17 to detect whether only the user's index finger is raised, or whether the user has made a finger-flicking motion. (4) The processor 10 detects the distance between the user terminal 100 and an object 1010 in its vicinity (such as the user's hand), based on at least one of the image recognition result of the image captured by the camera 17 and the output value of the distance measuring sensor 18. For example, the processor 10 detects whether the user's hand is near the user terminal 100 (for example, at a distance less than a predetermined value) or far from it (for example, at a distance equal to or greater than a predetermined value), based on the size of the shape of the user's hand identified from the captured image. When the captured image is a moving image, the processor 10 may detect whether the user's hand is approaching or moving away from the user terminal 100. (5) When it is found, based on the image recognition result of the captured image of the camera 17 or the like, that the distance between the user terminal 100 and the user's hand is changing while the user's hand is being detected, the processor 10 recognizes that the user is waving the hand in the shooting direction of the camera 17. When an object is intermittently detected by the distance measuring sensor 18, which has higher directivity than the shooting range of the camera 17, the processor 10 recognizes that the user is waving the hand in a direction orthogonal to the shooting direction of the camera.
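Process (2) above maps the number of extended fingers to a rock-paper-scissors gesture. A hypothetical sketch of that mapping is shown below; the finger counting itself (for example, by pattern matching on the captured image) is outside the scope of this sketch:

```python
# Illustrative mapping from a detected finger count to a gesture label,
# following the correspondence stated in process (2). The "unknown" fallback
# is an assumption for counts the description does not cover.
def gesture_from_finger_count(fingers: int) -> str:
    if fingers == 5:
        return "paper"     # 「パー」
    if fingers == 0:
        return "rock"      # 「グー」
    if fingers == 2:
        return "scissors"  # 「チョキ」
    return "unknown"


print(gesture_from_finger_count(5))
print(gesture_from_finger_count(0))
print(gesture_from_finger_count(2))
```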

 このように、プロセッサ10は、カメラ17の撮影画像に対する画像認識により、ユーザが手を握りこんでいるか否か(「グー」のジェスチャであるか、それ以外のジェスチャ(例えば「パー」)であるか)を検出する。また、プロセッサ10は、ユーザの手の形状とともに、ユーザがこの手をどのように移動させているかを検出する。また、プロセッサ10は、ユーザがこの手をユーザ端末100に対して接近させているのか遠ざけているのかを検出する。このような操作は、例えば、マウスまたはタッチパネルなどのポインティングデバイスを用いた操作に対応させることができる。ユーザ端末100は、例えば、ユーザの手の移動に応じて、タッチスクリーン15においてポインタを移動させ、ユーザのジェスチャ「グー」を検出する。この場合、ユーザ端末100は、ユーザが選択操作を継続中であると認識する。選択操作の継続とは、例えば、マウスがクリックされて押し込まれた状態が維持されること、または、タッチパネルに対してタッチダウン操作がなされた後タッチされた状態が維持されることに対応する。また、ユーザ端末100は、ユーザのジェスチャ「グー」が検出されている状態で、さらにユーザが手を移動させると、このような一連のジェスチャを、スワイプ操作(またはドラッグ操作)に対応する操作として認識することもできる。また、ユーザ端末100は、カメラ17の撮影画像によるユーザの手の検出結果に基づいて、ユーザが指をはじくようなジェスチャを検出した場合に、当該ジェスチャを、マウスのクリックまたはタッチパネルへのタップ操作に対応する操作として認識してもよい。 In this way, the processor 10 detects, by image recognition on the image captured by the camera 17, whether the user is clenching a hand (a "rock" gesture) or making some other gesture (for example, "paper"). The processor 10 also detects, together with the shape of the user's hand, how the user is moving that hand, and whether the user is moving the hand toward or away from the user terminal 100. Such operations can be made to correspond to operations using a pointing device such as a mouse or a touch panel. For example, the user terminal 100 moves a pointer on the touch screen 15 in accordance with the movement of the user's hand, and detects the user's "rock" gesture. In this case, the user terminal 100 recognizes that the user is continuing a selection operation. Continuing a selection operation corresponds, for example, to keeping a mouse button clicked and held down, or to keeping the touch panel touched after a touch-down operation. Further, when the user moves the hand while the "rock" gesture is being detected, the user terminal 100 can recognize such a series of gestures as an operation corresponding to a swipe operation (or a drag operation). In addition, when the user terminal 100 detects a finger-flicking gesture from the detection result of the user's hand based on the image captured by the camera 17, it may recognize that gesture as an operation corresponding to a mouse click or a tap operation on the touch panel.

<First Embodiment>
FIG. 2 is a diagram illustrating a functional configuration of the user terminal 100. The functional configuration of the user terminal 100 will be described with reference to FIG. 2. The user terminal 100 can function as a control unit 110 and a storage unit 120 through the cooperation of the processor 10, the memory 11, the storage 12, the communication IF 13, the input/output IF 14, and the like. The game program stored in the storage unit 120 is loaded onto the main memory and executed by the control unit 110. According to the game program, the control unit 110 can function as an input operation receiving unit 111, a camera arrangement control unit 112, a display control unit 113, and an object control unit 114. The main memory also temporarily stores various game data generated while the control unit 110 operates according to the program, as well as various game data used by the control unit 110. The storage unit 120 stores data necessary for the control unit 110 to function as each of the above units. Examples of such data include the game program, game information, and user information. The game information includes an object management table, a skill management table, reference motion data 121, combo data 122, a history information table (described later), and a user management table. The object management table is a table for managing various game objects. The skill management table is a table for managing various skills of game characters. The reference motion data 121 is a table for defining the motion of each game character. The combo data 122 is a table for managing the contents (for example, attack power, motion, and the like) of attack actions that a character is caused to execute by a combo generated under predetermined conditions. The history information table is a table including a plurality of pieces of information indicating contact positions detected by the input unit 151. The user management table is a table including, among other things, a history of actions executed by the game character.

The control unit 110 controls the operation of the entire user terminal 100, and performs data transmission and reception between the elements, arithmetic processing necessary for executing the game, and other processing. For example, the control unit 110 advances the game according to the game program based on an input operation detected by the input operation receiving unit 111, and renders a game image showing the result. The control unit 110 operates objects in the game space based on user information received from the game server 200, calculation results of the game program, and input operations detected by the input operation receiving unit 111. Examples of such objects include a character object (game character) and a target object (another object). The character object is the object operated by the user in the game, and the target object is an object on which the character object acts. The control unit 110 also generates an image captured by a virtual camera that shoots the game space. Further, the control unit 110 performs processing such as updating the various data stored in the storage unit 120 based on input operations on the touch screen 15, results of arithmetic processing, and the like. In addition, the control unit 110 refers to the various user information and game information stored in the storage unit 120 and executes various determinations necessary for the progress of the game.

The input operation receiving unit 111 detects the type of the user's input operation on the input unit 151. The input operation receiving unit 111 determines what kind of operation has been performed from operation instructions given via the input unit 151 and the other input/output IFs 14, and outputs the determination result to the elements that need it, such as the camera arrangement control unit 112. Examples of the types of input operation determined by the input operation receiving unit 111 include a first input operation and a second input operation. The first input operation is an operation input by moving the touch position on the touch screen 15 from one position (first position) to another position (second position); examples include a flick operation and a swipe operation. For example, the input operation receiving unit 111 treats as a "flick operation" an operation input by the user moving the touch position from the first position on the touch screen 15 to another position in a time shorter than a predetermined time and then releasing the contact. The input operation receiving unit 111 may also treat as a "swipe operation" an operation (third input operation) input by moving the touch position on the touch screen 15 from one position (third position) to another position (fourth position) and then keeping the touch position at the position reached after the movement. The second input operation is an operation input without changing the touch position on the touch screen 15; an example is a tap operation.

When the input operation receiving unit 111 detects contact of an object with the touch screen 15 from a state in which no contact was detected, it determines that a "touch-on state" has occurred. When no contact of an object with the touch screen 15 is detected, the input operation receiving unit 111 determines that a "touch-off state" has occurred. Further, the input operation receiving unit 111 successively receives history information indicating the touch position on the touch screen 15 as history information of the "touch-now state".

FIG. 3 is a diagram illustrating an example of the history information table that the input operation receiving unit 111 refers to in order to detect the type of input operation. An example of how the input operation receiving unit 111 determines the type of input operation will be described with reference to FIG. 3. In FIG. 3, history information indicating positions on the touch screen 15 detected by the input unit 151 is stored in each of eleven array elements, fp[0] to fp[10]. History information is stored in the history information table at every predetermined period (for example, every frame). The number of array elements storing history information is not limited and may be any number. It is also preferable that the history information detected when the state changes from touch-off to touch-on be stored in the storage unit 120 as initial position coordinates.

In the table of FIG. 3, for example, when history information (x0, y0) is stored in the array elements fp[0] to fp[9] and a null value is stored in fp[10], the input operation receiving unit 111 determines that the input operation is a tap operation. Also, for example, when a null value is stored after the history information has changed in the touch-now state, the input operation receiving unit 111 refers to the history information stored in the array elements immediately preceding the element in which the null value is stored, for example fp[3] and fp[4] when the null value is stored in fp[5]. If the distance between the positions indicated by the history information of fp[3] and fp[4] is equal to or greater than a preset threshold, the input operation receiving unit 111 determines that the input operation is a flick operation. Further, when, after the history information has changed in the touch-now state, the same history information (x15, y15) is stored in, for example, fp[4] to fp[10], the input operation receiving unit 111 determines that the input operation is a swipe operation.
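The classification procedure described for FIG. 3 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name, the distance threshold, and the use of a Python list with `None` for the null value are all assumptions; only the three decision rules (unchanged position then release = tap, fast movement just before release = flick, movement then a held position = swipe) come from the description above.

```python
import math

# Assumed threshold for the distance between the last two samples before release;
# the patent only says "a preset threshold".
FLICK_DISTANCE_THRESHOLD = 20.0

def classify_input(fp):
    """Classify a per-frame history of touch positions (None marks touch-off)."""
    if None in fp:
        release = fp.index(None)          # first frame with no contact
        touched = fp[:release]
        if len(set(touched)) <= 1:
            return "tap"                  # position never moved before release
        if release >= 2:
            (x1, y1), (x2, y2) = fp[release - 2], fp[release - 1]
            if math.hypot(x2 - x1, y2 - y1) >= FLICK_DISTANCE_THRESHOLD:
                return "flick"            # still moving fast at the moment of release
        return None                       # released, but too slow to count as a flick
    if len(set(fp)) > 1 and fp[-1] == fp[-2]:
        return "swipe"                    # moved, then held in place (touch-now continues)
    return None                           # still touching; not yet classifiable
```

For example, a buffer of eleven identical positions followed by a null classifies as a tap, while a buffer whose last two samples before the null are far apart classifies as a flick.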

The camera arrangement control unit 112 controls the arrangement of the virtual camera that shoots the game space. Based on the user information stored in the storage unit 120, the calculation results of the game program, and the type of operation detected by the input operation receiving unit 111, the camera arrangement control unit 112 controls the arrangement of the virtual camera, which specifies the field of view of the game space. It is preferable that the camera arrangement control unit 112 arrange the virtual camera so that the character object is displayed near the center of the screen. The camera arrangement control unit 112 outputs the image captured by the virtual camera to the display control unit 113.

The display control unit 113 displays the image captured by the virtual camera on the display unit 152. The display control unit 113 also causes the display unit 152 to display objects in accordance with instructions from the object control unit 114.

The object control unit 114 controls objects such as the character object and target objects based on the user's input operations and/or the game program. For example, the object control unit 114 performs control for moving the character object, control for causing the character object to execute actions for acting on target objects, and the like. The object control unit 114 also instructs the display control unit 113 to display, on the display unit 152, objects corresponding to the input operation received by the input operation receiving unit 111.

<Example of Processing by the Game System>
Next, a game in which the user terminal 100 causes a character C (character object) to execute various actions in accordance with input operations from the user will be described with reference to FIGS. 4 and 5. FIG. 4 is an example of a flowchart showing the flow of processing executed by the game system 1. FIG. 5 is a diagram schematically showing an example of a game space image that the display control unit 113 displays on the touch screen 15. Examples of actions that the character C is caused to execute include actions for acting on a target object and actions for moving. An example of such an act is an attack, and an example of an action for acting on a target object is an attack action. Examples of target objects include an enemy character object that performs attack actions or the like on the character C, and obstacle objects such as the door O shown in FIG. 5. Moving is an action for changing the position of the character C in the game space; examples of actions for moving include walking, running, and an avoidance action for avoiding another object or an attack action by another object.

In step S101, when the input operation receiving unit 111 receives an operation to start the game, the control unit 110 starts the game. Next, the camera arrangement control unit 112 controls the arrangement of the virtual camera in the game space with reference to the game information and the like. In step S102, the display control unit 113 displays the game space image captured by the virtual camera on the touch screen 15.

In step S103, the input operation receiving unit 111 determines whether an input operation on the touch screen 15 has been received. If NO in step S103, the input operation receiving unit 111 repeats the processing of step S103.

If YES in step S103, in step S104 the input operation receiving unit 111 determines whether the input operation received in step S103 is the first input operation. A case where the first input operation is a flick operation will be described below. In the image shown in FIG. 5, the first position is a position L1 and the second position is a position L2. When the input operation receiving unit 111 receives a flick operation on the touch screen 15, it specifies the direction of the touch operation determined by the positions L1 and L2.

If YES in step S104, in step S105 the display control unit 113 displays an object indicating the direction of the touch operation on the touch screen 15. For example, as shown in FIG. 5, the display control unit 113 may be configured to display on the touch screen 15 an object or the like extending from the position L1 to the position L2. In the present embodiment, first, when the input operation receiving unit 111 receives a touch-on at the position L1, the display control unit 113 causes the touch screen 15 to display an elastic object E1 centered on the position L1. Next, when the input operation receiving unit 111 receives an operation of moving the touch position to the position L2 in the touch-now state, the display control unit 113 causes the touch screen 15 to display an elastic object E2 obtained by deforming the elastic object E1 so as to stretch it toward the direction of the touch operation.

In step S106, the object control unit 114 specifies the direction in which the character C is facing. For example, the storage unit 120 stores information indicating the direction in which the face of the character C is oriented, the traveling direction of the immediately preceding movement operation, and the like. Based on the actions the character C has been caused to execute, effects received from other objects, and the like, the object control unit 114 selects which of the directions indicated by this information is to be treated as the direction in which the character C is facing. The object control unit 114 then refers to the storage unit 120 based on the selection result and specifies the direction in which the character C is facing.

In step S107, the object control unit 114 compares the direction of the touch operation with the direction in which the character C is facing on the screen displayed on the touch screen 15.

In step S108, the object control unit 114 causes the character C to execute an action corresponding to the comparison result in step S107. For example, if the direction of the touch operation is included in a certain range of directions determined by the direction in which the character C is facing, the object control unit 114 causes the character C to execute an attack action. The certain range of directions is not particularly limited; for example, taking the direction in which the character C is facing as an axis, it may be a fixed range to the left and right of that axis (for example, within 30 degrees or within 45 degrees on each side).

Also in step S108, if the direction of the touch operation is not included in the certain range of directions, the object control unit 114 causes the character C to execute an avoidance action of moving in the direction of the touch operation, and does not cause it to execute an attack action. After step S108, the processing returns to step S103.
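The comparison of steps S106 to S108 can be illustrated with a short sketch. This is a hypothetical rendering only: the function name, the vector representation of directions, and the 45-degree cone half-angle are assumptions; the patent specifies only that the touch direction is compared against a fixed angular range around the facing direction, with attack inside the range and avoidance outside it.

```python
import math

ATTACK_CONE_DEG = 45.0  # assumed half-angle of the "certain range" in step S108

def action_for_flick(facing, touch_from, touch_to):
    """Decide attack vs. avoidance from the flick direction and facing direction.

    facing: (dx, dy) vector the character C is facing.
    touch_from, touch_to: the first (L1) and second (L2) touch positions.
    """
    flick = (touch_to[0] - touch_from[0], touch_to[1] - touch_from[1])
    facing_angle = math.degrees(math.atan2(facing[1], facing[0]))
    flick_angle = math.degrees(math.atan2(flick[1], flick[0]))
    # Smallest absolute angle between the two directions, in [0, 180].
    diff = abs((flick_angle - facing_angle + 180.0) % 360.0 - 180.0)
    return "attack" if diff <= ATTACK_CONE_DEG else "avoid"
```

For instance, with the character facing along the positive x-axis, a flick nearly parallel to that axis selects the attack action, while a flick perpendicular to it selects the avoidance action.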

If NO in step S104, in step S109 the object control unit 114 causes the character C to execute an action corresponding to the input operation. For example, when the input operation is a tap operation, the object control unit 114 causes the character C to execute an attack action. The attack action corresponding to the tap operation is preferably different from the attack action the character C is caused to execute in step S108. With this configuration, the character C can be made to execute a variety of attack actions through a variety of operations, which improves the appeal of the game. After step S109, the processing returns to step S103.

In this way, the object control unit 114 can vary the action, for example causing the character C to execute an attack action or an avoidance action, depending on the direction in which the character C is facing and the direction of the touch operation, which improves the appeal of the game. In particular, it is possible to give variety to input operations directed at an attack target.

Further, through the processing of step S105, the display control unit 113 can visualize the direction of the touch operation. Thus, when the user wants the character C to perform an attack action, the user can visually confirm both the direction in which the character C is facing and the direction of the touch operation. Accordingly, the user can easily move the touch position in a desired direction while referring to the direction in which the character C is facing.

A flick operation is also an operation that the user can input intuitively and quickly in scenes that demand quick operation during the game. In the present embodiment, since an attack action or an avoidance action can be selected according to the direction of the flick operation, attack and avoidance can be selected intuitively. This improves the appeal of the game.

(Modification 1)
In the above description, the case where the first input operation is a flick operation has been described, but similar processing may be performed when the first input operation is a swipe operation. That is, when the input operation receiving unit 111 receives a swipe operation in a direction included in the certain range of directions, the object control unit 114 may cause the character C to execute an attack action. When the input operation receiving unit 111 receives a swipe operation in a direction outside the certain range, the object control unit 114 may cause the character C to execute a movement action.

(Modification 2)
When the input operation receiving unit 111 receives the first input operation, the control unit 110 may change the action the character C is caused to execute depending on whether the first input operation is a flick operation or a swipe operation.

For example, when the operation received by the input operation receiving unit 111 is a flick operation, the control unit 110 executes steps S105 to S108. On the other hand, when the operation received by the input operation receiving unit 111 is a swipe operation, the control unit 110 executes the following processing instead of steps S105 to S108.

When the input operation receiving unit 111 receives a swipe operation, the display control unit 113 displays on the touch screen 15 an object indicating the direction of the touch operation determined by the third and fourth positions of the swipe operation. For example, when the third and fourth positions are the positions L1 and L2 shown in FIG. 5, respectively, the display control unit 113 displays on the touch screen 15 an elastic object similar to the elastic object E2 shown in FIG. 5.

Next, the object control unit 114 causes the character C to execute an action for moving in the direction of the touch operation. For example, when the direction of the touch operation is rightward, the object control unit 114 causes the character C to execute an action for moving to the right. Since a swipe operation is an operation in which contact with the touch screen 15 is maintained for a long time, movement lets the user grasp the relationship between the operation and the character C's response more intuitively than an attack action, avoidance action, or other action in which the character C moves quickly. This can further improve the appeal of the game.

The object control unit 114 also changes the direction in which the character C is facing to a direction determined based on the direction of the touch operation corresponding to the swipe operation. As a result, the traveling direction of the immediately preceding movement operation becomes the direction in which the character C is facing. The user will often recognize the traveling direction of the movement operation he or she has just performed as the direction in which the character C is facing. Therefore, the direction the user believes the character C is facing matches the direction the object control unit 114 sets as the direction the character C is facing. Consequently, when, following the action for moving, the user performs an operation for attacking or for some other action using the direction the character C is facing as a reference, erroneous operations are less likely to occur.

In this way, even for input operations input by moving the touch position from one position to another, varying the action the character C is caused to execute between a flick operation and a swipe operation allows the character C to be made to perform a greater variety of actions.

Further, since the display control unit 113 displays an object indicating the direction of the touch operation on the touch screen 15, the direction of the swipe operation can be visualized. This makes it easy to move the character C in a desired direction, improving usability.

(Modification 3)
When the input operation receiving unit 111 receives input operations a plurality of times within a predetermined period, the action that the object control unit 114 causes the character C to execute upon receiving the current input operation may be associated with at least part of the preceding input operations as well as with the current input operation. That is, the object control unit 114 may cause the character C to execute a so-called combo attack action. For example, when the input operation receiving unit 111 receives input operations in the order of a tap operation and then a flick operation within a predetermined period, the action that the object control unit 114 causes the character C to execute in response to the flick operation may be an action based on the fact that the previous operation was a tap operation and the current operation is a flick operation.

FIG. 6 is another example of a flowchart showing the flow of processing executed by the game system 1. The processing of steps S101 to S107 and step S109 is the same as described above, so the description will not be repeated.

After executing step S107, the object control unit 114 causes the character C to execute an action corresponding to the comparison result. First, in step S201, the object control unit 114 determines whether the direction of the touch operation is included in the certain range of directions determined by the direction in which the character C is facing.

If YES in step S201, in step S202 the object control unit 114 determines whether a tap operation for performing an attack action was received within a certain period before the flick operation was received in step S103.

If YES in step S202, in step S203 the object control unit 114 causes the character C to execute an attack action based on the fact that the operation corresponding to the previous attack action was a tap operation and the current operation is a flick operation. In this way, when the character C is caused to execute attack actions in succession, the character C can be made to execute different attack actions rather than the same attack action every time, which makes the attack actions diverse. After step S203, the processing returns to step S103.

If NO in step S202, in step S204 the object control unit 114 causes the character C to execute an attack action based on the flick operation, without referring to the combo data 122. After step S204, the processing returns to step S103.

If NO in step S201, in step S205 the object control unit 114 determines whether a tap operation was received within a certain period before the flick operation received in step S103.

If YES in step S205, then in step S206 the object control unit 114 causes the character C to perform an avoidance action, moving in the direction of the touch operation without performing an attack action. Subsequently, in step S207, the object control unit 114 causes the character C to perform an attack action. The attack action that the object control unit 114 causes the character C to perform in step S207 may differ from the attack action based on a tap operation. In that case, the attack actions performed by the character C become more diverse, which enhances the appeal of the game. After step S207, the process returns to step S103.

If NO in step S205, then in step S208 the object control unit 114 causes the character C to perform an avoidance action without performing an attack action. After step S208, the process returns to step S103.

In this way, the user can cause the character C to perform a variety of actions according to the input operation entered before the first input operation and the direction of the first input operation, which further enhances the appeal of the game.
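The branching of steps S201 through S208 can be sketched as a small decision function. The function name and the string representation of actions below are illustrative assumptions, not part of the disclosed implementation:

```python
# Sketch of the step S201-S208 branching (assumed names; illustrative only).
# direction_ok: whether the flick direction falls within the range determined
# by the character's facing direction (step S201).
# recent_tap: whether a tap was received within the fixed period before the
# flick (steps S202 / S205).
def decide_flick_response(direction_ok: bool, recent_tap: bool) -> list:
    if direction_ok:
        if recent_tap:
            # S203: combo attack based on "previous tap + current flick"
            return ["combo_attack"]
        # S204: plain flick attack, combo data 122 not consulted
        return ["flick_attack"]
    if recent_tap:
        # S206 + S207: evade in the flick direction, then attack
        return ["evade", "attack"]
    # S208: evade only, no attack
    return ["evade"]
```

Each of the four branches maps to exactly one of the four terminal steps (S203, S204, S206/S207, S208) before control returns to step S103.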

If NO in step S104, then in step S109 the input operation reception unit 111 may determine whether the input operation received in step S103 was a tap operation. Furthermore, when the input operation reception unit 111 determines that the received input operation is a tap operation, it may further determine whether another tap operation was received within a certain period before that tap operation. After step S109, the process returns to step S103.

For example, after the process returns to step S103, if the input operation reception unit 111 receives a flick operation within a certain period and steps S202 and S203 are executed, a combo attack based on the series of input operations tap, tap, flick is established. The actions performed by the character C in this combo attack are described with reference to FIG. 7.

FIG. 7 is a diagram illustrating an example of the actions performed when the object control unit 114 causes the character to perform attack actions based on the combo data 122. State (A) of FIG. 7 shows action a1, state (B) shows action a2, and state (C) shows action a3. When the input operation reception unit 111 receives a tap operation, the object control unit 114 first causes the character C to perform action a1, shown in state (A) of FIG. 7. When the input operation reception unit 111 receives another tap operation within a certain period, the object control unit 114 causes the character C to perform action a2, shown in state (B) of FIG. 7. When the input operation reception unit 111 then receives a flick operation within a certain period, the object control unit 114 causes the character C to perform action a3, shown in state (C) of FIG. 7. Action a3 may be set to have a more spectacular visual effect or greater attack power than actions a1 and a2. Alternatively, instead of the flick operation for action a3, a third consecutive tap operation may cause the object control unit 114 to have the character C perform another action with a spectacular effect or high attack power. According to an embodiment of the present disclosure, even in a game in which the character C performs combo attacks through touch operations on the touch screen 15, the flow of a combo can thus be branched.

In this way, the user can cause the character C to attack in a continuous flow through a variety of operations, which enhances the appeal of the game.
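The combo lookup of FIG. 7 can be sketched as a table keyed by the sequence of operations received within the combo window. The table structure, contents, and names below are illustrative assumptions:

```python
from typing import Optional

# Illustrative combo table (assumed structure): the key is the sequence of
# operations received within the combo window, the value is the action to play.
COMBO_DATA = {
    ("tap",): "a1",
    ("tap", "tap"): "a2",
    ("tap", "tap", "flick"): "a3",    # branch ending: flick finisher
    ("tap", "tap", "tap"): "a3_alt",  # branch ending: third-tap finisher
}

def next_action(history) -> Optional[str]:
    """Return the action for the given operation history, or None if no
    combo entry matches."""
    return COMBO_DATA.get(tuple(history))
```

The two three-element keys illustrate the branching described above: after tap, tap, either a flick or a third tap selects a different finishing action.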

When the flowchart shown in FIG. 6 ends, the control unit 110 returns to the process of step S103. With this configuration, when the input operation reception unit 111 receives input operations multiple times within a predetermined period, the object control unit 114 can have the character C continue the combo through the third and subsequent input operations.

The control unit 110 may also execute the processing from step S103 onward in parallel during the action execution period, that is, after the character C starts an attack action and before the action ends. For example, if the input operation reception unit 111 receives a flick operation during at least part of the period in which the object control unit 114 is executing step S203, the control unit 110 executes the processing from step S103 onward, refers to the combo data 122, and stores the action corresponding to that flick operation in the storage unit 120. When the object control unit 114 finishes the processing of step S203, it reads the stored action from the storage unit 120 and causes the character C to perform it. Even with this configuration, the object control unit 114 can have the character C perform a combo. The control unit 110 may also display the combo count on the touch screen 15 as the character C performs combos: each time the character C performs an action, the control unit 110 counts up the number of consecutive actions and displays the count value on the touch screen 15 as the combo count.
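The parallel handling just described amounts to buffering the next looked-up action while the current one plays, then reading it back when the action finishes. A minimal sketch, with all class and method names assumed:

```python
from collections import deque

class ActionQueue:
    """Minimal sketch (assumed names): inputs arriving while an action is
    playing are stored, and played back when the current action ends."""
    def __init__(self):
        self.pending = deque()
        self.combo_count = 0  # shown on the touch screen as the combo count

    def on_input_during_action(self, action):
        # Corresponds to storing the looked-up action in the storage unit 120.
        self.pending.append(action)

    def on_action_finished(self):
        # Each completed action counts toward the displayed combo count.
        self.combo_count += 1
        # Read back the buffered action, if any, to perform next.
        return self.pending.popleft() if self.pending else None
```

This is only one way to realize the behavior; the disclosure does not specify the buffering data structure.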

 <Second Embodiment>
 FIG. 8 is a block diagram showing the functional configurations of the server 200 and the user terminal 100 included in the game system 1. The functional configurations that the server 200 and the user terminal 100 each need in order to function as a general computer, and the functional configurations needed to realize well-known game functions, are omitted as appropriate.

The user terminal 100 functions as an input device that receives the user's input operations and as an output device that outputs game images and sounds. The user terminal 100 functions as the control unit 110 and the storage unit 120 through the cooperation of the processor 10, the memory 11, the storage 12, the communication IF 13, the input/output IF 14, and so on.

The server 200 communicates with each user terminal 100 and has the function of supporting the user terminal 100 in advancing the game, for example by selling valuable data and providing services. When the game is a multiplayer game, the server 200 may have the function of communicating with each user terminal 100 participating in the game and mediating exchanges between user terminals 100. The server 200 functions as the control unit 210 and the storage unit 220 through the cooperation of the processor 20, the memory 21, the storage 22, the communication IF 23, the input/output IF 24, and so on.

The storage unit 120-1 and the storage unit 220 store a game program 131, game information 132, and user information 133. The game program 131 is a game program executed by the user terminal 100 and the server 200.

The game information 132 is data that the control unit 110-1 and the control unit 210 refer to when executing the game program 131. The game information 132 may include information common to multiple users, for example: (1) information defining the game space, (2) basic parameters for each character, (3) basic parameters for each action, and (4) initial information and recommended information to be presented on the edit screen described later. The game space is the space in which the operation character and the various objects related to the game are placed. The game information 132 may also include information on various events held in the game space.

The user information 133 is data related to the user's account. For example, the user information 133 may include, in association with the identifier of the user's account: (1) information identifying the user of the account, (2) information on the possessed characters held by the account, (3) information on the actions each possessed character has acquired, (4) information indicating the account's progress in the game, and (5) information on the assets held by the account. Examples of account assets include in-game virtual currency, items, and equipment. The user information 133 also includes various other information managed for each account.

In the storage unit 220, the user information 133 is stored for each user terminal 100.

  (Functional configuration of the server 200)
 The control unit 210 controls the server 200 as a whole by executing the game program 131 stored in the storage unit 220. For example, the control unit 210 transmits various data, programs, and the like to the user terminal 100, and receives part or all of the game information or user information from the user terminal 100. When the game is a multiplayer game, the control unit 210 may receive a multiplayer synchronization request from a user terminal 100 and transmit data for synchronization to the user terminal 100. In addition, the control unit 210 has various functions, according to the nature of the game being executed, for supporting the progress of the game on the user terminal 100.

  (Functional configuration of the user terminal 100)
 The control unit 110-1 controls the user terminal 100 as a whole by executing the game program 131 stored in the storage unit 120-1. For example, the control unit 110-1 advances the game in accordance with the game program 131 and the user's operations. While the game is in progress, the control unit 110-1 also communicates with the server 200 to send and receive information as necessary.

In accordance with the description of the game program 131, the control unit 110-1 functions as an operation reception unit 111-1, a display control unit 112-1, a user interface (hereinafter, UI) control unit 113-1, an animation generation unit 114-1, a game execution unit 115, a camera placement control unit 116, an edit screen generation unit 117, and a continuous operation determination unit 118. The control unit 110-1 can also function as other functional blocks (not shown) for advancing the game, depending on the nature of the game being executed.

The operation reception unit 111-1 detects and receives the user's input operations on the input unit 151. The operation reception unit 111-1 determines what input operation was performed from the action the user exerted on the console via the touch screen 15 and the other input/output IF 14, and outputs the result to each element of the control unit 110-1.

For example, the operation reception unit 111-1 receives an input operation on the input unit 151, detects the coordinates of the input position of that operation, and identifies the type of the operation. In the present embodiment, input operations are assumed to be operations on the touch screen 15. The operation reception unit 111-1 identifies, as types of operations on the touch screen 15, for example a touch operation, a slide operation, a swipe operation, and a tap operation. The operation reception unit 111-1 also detects that the contact input on the touch screen 15 has been released when input that had been detected continuously is interrupted. The details of the operation types identified by the operation reception unit 111-1 in the present embodiment are described later.

The UI control unit 113-1 controls the UI objects displayed on the display unit 152 to construct the UI. A UI object is a tool by which the user provides the user terminal 100 with input necessary for the progress of the game, or obtains from the user terminal 100 information output during the progress of the game. Examples of UI objects include, but are not limited to, icons, buttons, lists, and menu screens.

The game execution unit 115 advances the game by causing the operation character present in the game space to perform actions based on the user's operations. In the present embodiment, the game progresses through battles between the operation character and enemy characters. The operation character performs actions in a battle with an enemy character to reduce the enemy character's physical strength, and wins the battle when the enemy character's physical strength is exhausted. The content of the action the operation character is made to perform is determined by the continuous operation determination unit 118 described later. Examples of action types include, but are not limited to, actions that affect an enemy character, movement of the operation character itself, and summoning another possessed character into the game space. Examples of actions that affect an enemy character include, but are not limited to, attacks on the enemy character and actions that change the enemy character's attributes. Examples of attacks on an enemy character include, but are not limited to, a move that knocks the enemy character back a predetermined distance and a move that staggers the enemy character, leaving it unable to act for a predetermined period. Examples of enemy character attributes that may be changed include, but are not limited to, attack power, defense power, accuracy, and movement speed.

The game execution unit 115 also notifies the animation generation unit 114-1, described later, of the determined action. In addition, when the position of the operation character in the game space or the direction it is facing changes as the operation character moves, the game execution unit 115 notifies the camera placement control unit 116, described later, of the new position or direction.

The game execution unit 115 also executes the various processes for advancing the game, communicating with the server 200 as necessary.

The camera placement control unit 116 defines a virtual camera for designating the region of the game space to be presented to the user. The camera placement control unit 116 virtually places the virtual camera in the game space by defining its position and orientation in the game space. Furthermore, the camera placement control unit 116 instructs the display control unit 112-1, described later, to create an image rendering the field-of-view region defined by the virtual camera and the objects placed in that region.

Specifically, for example, the camera placement control unit 116 may place the virtual camera behind the operation character based on the position of the operation character notified by the game execution unit 115. In this case, the camera placement control unit 116 orients the virtual camera in the direction the operation character is facing, as notified by the game execution unit 115. By controlling the virtual camera in this way, the user can view the game space from the viewpoint of the operation character while moving it. However, the placement and orientation of the virtual camera need not be controlled as described above.

The edit screen generation unit 117 generates an edit screen for associating, with each operation included in an operation sequence consisting of consecutive operations, an action corresponding to that operation's order in the sequence. In the present embodiment, one operation sequence is assumed to be a sequence of consecutive operations of the same type. Hereinafter, information in which an action corresponding to its order is associated with each operation in an operation sequence is also referred to as operation sequence information, and associating an action with each operation according to its order is also referred to as editing the operation sequence information. The details of the edit screen generation unit 117 are described later.

The continuous operation determination unit 118 determines whether an operation on the touch screen 15 follows the previous operation. When the continuous operation determination unit 118 determines that the current operation follows the previous operation, it refers to the operation sequence information to determine the action the operation character is to perform in response to the current operation. The details of the continuous operation determination unit 118 are described later.

The animation generation unit 114-1 generates animations showing the motion of various objects based on how those objects are controlled. For example, the animation generation unit 114-1 generates an animation depicting the operation character performing an action. If the action is an attack on an enemy character, the animation generation unit 114-1 generates an animation of the operation character's preparatory motion before the attack, an animation of the attack motion, and an animation of the return motion after the attack. The animation generation unit 114-1 also generates, for example, an animation depicting the enemy character being affected by the operation character's action (a hit-reaction motion).

The display control unit 112-1 outputs, to the display unit 152 of the touch screen 15, a game screen reflecting the results of the processing executed by the elements described above. For example, the display control unit 112-1 generates a game screen rendering the region of the game space within the field of view of the virtual camera defined by the camera placement control unit 116, together with the objects present in that region. The display control unit 112-1 also outputs game screens including the animations generated by the animation generation unit 114-1, and may draw the UI objects described above superimposed on the game screen.

The functions of the server 200 and the user terminal 100 shown in FIG. 8 are merely examples. The server 200 may have at least some of the functions of the user terminal 100, and the user terminal 100 may have at least some of the functions of the server 200. Furthermore, a device other than the user terminal 100 and the server 200 may be a component of the game system 1, and that other device may execute part of the processing in the game system 1. That is, the computer that executes the game program in the present embodiment may be any of the user terminal 100, the server 200, or another device, or may be realized by a combination of these devices.

  (Types of operations)
 The types of operations identified by the operation reception unit 111-1 are described in detail below. In the present embodiment, the operation types are assumed to be a tap operation, an upward flick operation, a downward flick operation, a left/right flick operation, and a rotation operation. A known method can be applied to detect a tap operation.

First, the methods for detecting an upward flick operation, a downward flick operation, and a left/right flick operation are described. Each of these operations is detected based on the direction of the flick operation; a known method can be applied to detect the flick operation itself. Here, the up, down, left, and right directions can be defined arbitrarily on the display unit 152. For example, the direction the character is facing in the game space displayed on the display unit 152 (that is, the direction the virtual camera is facing) may be defined as the upward direction. In this case, once the upward direction is defined, the opposite direction can be defined as the downward direction, and the direction obtained by rotating the upward direction 90 degrees to the left (or right) can be defined as the left (or right) direction.

In this case, the operation reception unit 111-1 detects an upward flick operation when the direction of a flick operation falls within a predetermined angular range around the upward direction. As a result, a flick toward the direction the operation character is facing is detected as an upward flick operation. Likewise, the operation reception unit 111-1 detects a downward flick operation when the direction of a flick operation falls within a predetermined angular range around the downward direction, so that a flick toward the rear of the operation character is detected as a downward flick operation. The operation reception unit 111-1 also detects a left/right flick operation when the direction of a flick operation falls within a predetermined angular range around the left (or right) direction, so that a flick toward the side of the operation character is detected as a left/right flick operation.
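This angular classification can be sketched as follows. The 45-degree half-range is an illustrative assumption, since the disclosure only specifies "a predetermined angular range":

```python
def classify_flick(flick_deg, facing_deg, half_range_deg=45.0):
    """Classify a flick as 'up', 'down', or 'left_right' relative to the
    character's facing direction. Angles are in degrees; the 45-degree
    half-range is an assumed value."""
    # Signed difference between flick direction and the facing ("up")
    # direction, normalized into [-180, 180).
    diff = (flick_deg - facing_deg + 180.0) % 360.0 - 180.0
    if abs(diff) <= half_range_deg:
        return "up"          # toward the character's facing direction
    if abs(diff) >= 180.0 - half_range_deg:
        return "down"        # toward the character's rear
    return "left_right"     # toward the character's side
```

For a character facing 90 degrees, a flick at 90 degrees classifies as "up", one at 270 degrees as "down", and one at 0 degrees as "left_right".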

Next, the method for detecting a rotation operation is described. A rotation operation is an operation in which the locus of the contact position (that is, the touch position) of the object 1010 during a drag operation is ring-shaped or nearly ring-shaped. The ring drawn by the locus of a rotation operation need not be closed, and the start and end positions of the drag operation in a rotation operation need not coincide.

An example of a method for detecting a rotation operation is described below. For example, a rotation operation can be identified by detecting changes in the vector components of a drag operation over a predetermined period. Specifically, if a drag operation continues for a predetermined period after the contact of the object 1010 with the input unit 151 is detected, the operation reception unit 111-1 detects changes in the vector components of the operation direction. These changes are described with reference to FIG. 9, which is a schematic diagram showing the changes in the vector components of the operation direction during a rotation operation. Here, the operation direction is represented by a pair consisting of an x component along the x axis and a y component along the y axis of an arbitrary orthogonal coordinate system defined on the plane of the touch screen 15. A positive x component is denoted "x-axis positive direction", a negative x component "x-axis negative direction", a positive y component "y-axis positive direction", and a negative y component "y-axis negative direction".

 For example, as shown in FIG. 9(A), suppose that during the predetermined period the vector components of the operation direction change in the order (1) (x-axis negative direction, y-axis positive direction), (2) (x-axis positive direction, y-axis positive direction), (3) (x-axis positive direction, y-axis negative direction), (4) (x-axis negative direction, y-axis negative direction). In this case, the operation reception unit 111-1 determines that a rotation operation has been detected; here the rotation is clockwise. Likewise, as shown in FIG. 9(B), suppose the vector components change in the order (1) (x-axis positive direction, y-axis positive direction), (2) (x-axis negative direction, y-axis positive direction), (3) (x-axis negative direction, y-axis negative direction), (4) (x-axis positive direction, y-axis negative direction). The operation reception unit 111-1 again determines that a rotation operation has been detected; in this case the rotation is counterclockwise. In this way, the operation reception unit 111-1 may store in advance the patterns of vector-component change that represent rotation operations and, when a matching pattern is detected, determine that a rotation operation has occurred. The applicable change patterns are not limited to the examples above, and the detection method for rotation operations is not limited to the method described here; other methods may also be applied.
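The quadrant-sequence matching described above can be sketched in a few lines. This is only a minimal illustration of the idea, not the patent's implementation; the function names, the zero-handling in `sign_pair`, and the deduplication of consecutive identical quadrants are all assumptions.

```python
def sign_pair(dx, dy):
    """Classify a drag vector by the signs of its x and y components.
    A zero component is treated as negative here (an assumption)."""
    return (dx > 0, dy > 0)  # (x positive?, y positive?)

# Quadrant sequences corresponding to FIG. 9(A) and FIG. 9(B).
CLOCKWISE = [(False, True), (True, True), (True, False), (False, False)]
COUNTERCLOCKWISE = [(True, True), (False, True), (False, False), (True, False)]

def detect_rotation(vectors):
    """Return 'cw', 'ccw', or None for a list of (dx, dy) drag vectors
    sampled over the predetermined period."""
    pattern = []
    for dx, dy in vectors:
        p = sign_pair(dx, dy)
        if not pattern or pattern[-1] != p:  # record only quadrant changes
            pattern.append(p)
    if pattern == CLOCKWISE:
        return "cw"
    if pattern == COUNTERCLOCKWISE:
        return "ccw"
    return None
```

A real implementation would likely also tolerate noisy samples and partial rotations, but the pattern-table lookup is the core of the method described.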

  (Details of edit screen generation unit 117)
 The edit screen generated by the edit screen generation unit 117 will now be described. The edit screen is a screen for editing operation sequence information. In the following, the i-th operation (i = 1, 2, ...) from the beginning of an operation sequence consisting of operations of the same type is also referred to as the operation of order i, and the action associated with the operation of order i is also referred to as the i-th-stage action. The operation sequence information is referenced when the continuous operation determination unit 118, described in detail later, determines that operations are continuous. When an operation of one type is followed by an operation of another type, the operation sequence information for the first type is followed by a reference to the operation sequence information for the second type. Accordingly, by editing with the combinations of each type's operation sequence information in mind, the user can make the actions delivered in succession by consecutive operations far more varied.

 The edit screen generation unit 117 generates an edit screen for each character held by the user. A held character is a character that can become the operation character when selected by the user. Held characters include characters granted to the user in advance at the start of the game, as well as characters granted additionally as the game progresses. For example, a held character may be granted when the operation character wins a battle against an enemy character. An additionally granted character may be determined by lottery from among one or more characters. An additionally granted character may also appear as the contents of an unopened item that is granted to the user as the game progresses, when that item is opened. In this case, the unopened item may be opened by satisfying a predetermined condition. Suppose, for example, that the unopened item is represented as an egg granted when the operation character wins a battle. Such an egg may be set in a virtual incubator and hatch when a predetermined condition is satisfied, whereupon a character appears. The period until the predetermined condition for opening the unopened item is satisfied may be shortened in exchange for a consumable item such as virtual currency, or the unopened item may be opened in exchange for a consumable item instead of satisfying the predetermined condition.

 An example of the edit screen is shown in FIGS. 10(A) and 10(B). In FIGS. 10(A) and 10(B), the edit screen includes an operation sequence component 115a for each type of operation and a list of action icons 115b. In this example, operation sequence components 115a are included for five types of operation (tap, upward flick, downward flick, left/right flick, and rotation). However, the number of operation sequence components 115a included in the edit screen is not limited to the number illustrated. When the number of displayable action icons 115b exceeds the number that can be shown at once (six in FIGS. 10(A) and 10(B)), a scroll bar 115c is displayed. Operating the scroll bar 115c reveals the action icons 115b that were hidden.

 First, the operation sequence component 115a will be described. The operation sequence component 115a is a component in which slots 115ai (i is a natural number from 1 to N), each representing the action associated with the operation of order i among consecutive operations of the corresponding type, are arranged in order from the first stage to the Nth stage. Here, N is an integer of 1 or more and represents the maximum number of operations that can be included in an operation sequence of that type. The maximum number N is the upper limit on how many actions can be delivered in succession by consecutive operations of that type, and is hereinafter also referred to as the maximum consecutive number N. The maximum consecutive number N may differ, for example, according to the characteristics of the held character in question, or according to the characteristics of the operation type in question. The maximum consecutive number N can also be increased when a predetermined condition is satisfied. Conditions for increasing the maximum consecutive number N include, for example, exchanging a consumable item such as virtual currency, or the held character leveling up; however, the conditions are not limited to these.

 Of the N slots 115ai included in the operation sequence component 115a, actions can be associated with (that is, edited into) the slots up to the editable number M. Slots 115ai beyond the editable number M cannot be edited and cannot have actions associated with them; such slots are displayed so as to indicate that they are not editable. In the examples of FIGS. 10(A) and 10(B), a slot 115ai displaying a lock mark indicates that it cannot be edited. The number of actions that can be delivered in succession by consecutive operations of that type is therefore limited to the editable number M. The editable number M, which may differ per held character, can be increased up to the maximum consecutive number N according to conditions satisfied by the held character. Conditions for increasing the editable number M include, for example, exchanging a consumable item such as virtual currency, or the held character leveling up; however, the conditions are not limited to these.
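The slot row described in the two paragraphs above could be modeled as follows. This is a hypothetical sketch: the class and method names are illustrative, and the unlock mechanism stands in for whatever condition (item exchange, level-up) the game actually checks.

```python
class SlotRow:
    """One operation-type row: N slots in total, only the first M editable."""

    def __init__(self, max_slots_n, editable_m):
        assert 1 <= editable_m <= max_slots_n
        self.n = max_slots_n          # maximum consecutive number N
        self.m = editable_m           # editable number M
        self.slots = [None] * max_slots_n  # None represents an empty slot

    def assign(self, index, action):
        """Associate an action with slot `index` (0-based).
        Fails when the slot lies beyond the editable number M (locked)."""
        if index >= self.m:
            return False
        self.slots[index] = action
        return True

    def unlock_one(self):
        """Raise the editable number M by one, capped at the maximum N."""
        self.m = min(self.m + 1, self.n)
```

For the FIG. 10(A) example, `SlotRow(5, 4)` would allow editing of slots 1-4 while slot 5 stays locked until `unlock_one()` is triggered by the relevant condition.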

 Next, the action icons 115b will be described. An action icon 115b represents an action that can be associated with an operation in the operation sequence information. As described above, the types of associable actions include actions that affect an enemy character, actions that move the operation character, and actions that summon another held character.

 Various kinds of information are defined for the action represented by an action icon 115b, such as the action's name, its characteristics, and its parameters. Information about each action may also be displayed near its action icon 115b. In the example of FIG. 10(A), the characteristic of the action named "A1" is "xx" and its cost is "yy". Cost is one example of an action parameter; its details are described later. The information defined for an action is not limited to that described above, and the information displayed near the action icon 115b is not limited to these items. In the example of FIG. 10(A), each action icon 115b is displayed as an icon containing the name of the corresponding action, but the display form of the action icon 115b is not limited to this; for example, it may be displayed as an icon whose design represents the corresponding action.

 Next, the list of action icons 115b will be described. The list of action icons 115b presents the actions that can be associated with the operation corresponding to a slot 115ai. The associable actions include actions already acquired by the held character in question, and may further include actions already acquired by other held characters of the same type as that character.

 A held character may acquire a new action by satisfying a condition. The condition for acquiring a new action may be, for example, that the character's play time as the operation character exceeds a threshold, which expresses the operation character hitting upon a new technique as the game progresses. The condition may also be, for example, that the operation character acquires a predetermined item, which expresses the operation character learning a new technique by using that item.

 The list of action icons 115b presents actions whose characteristics correspond to the type of operation associated with the slot 115ai being edited. For example, the characteristic corresponding to a tap operation may be high homing performance; the characteristic corresponding to an upward or downward flick may be high attack power; and the characteristic corresponding to a left/right flick may be a wide attack range. The characteristics of the associable actions may also be varied according to the operation type. For example, compared with upward, downward, or left/right flicks, the homing performance of actions associable with a tap operation may be made higher and that of actions associable with a rotation operation lower. Likewise, the homing performance of actions associable with a left/right flick may be made higher than that of actions associable with an upward or downward flick.

 For example, in FIG. 10(A), a slot 115ai in the operation sequence component 115a for the tap operation has been tapped, so a list of action icons 115b representing actions A1 to A6, which have characteristics corresponding to the tap operation, is displayed. In FIG. 10(B), a slot 115ai in the operation sequence component 115a for the upward flick operation has been tapped, so a list of action icons 115b representing actions B1 to B6, which have characteristics corresponding to the upward flick operation, is displayed.
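The characteristic-based filtering behind the lists in FIG. 10(A) and (B) could be sketched as below. The action data, the operation-type-to-characteristic mapping, and all names here are assumptions for illustration only; the source does not specify how the filtering is implemented.

```python
# Hypothetical acquired-action data: each action carries one characteristic.
ACTIONS = [
    {"name": "A1", "trait": "homing"},   # suited to tap operations
    {"name": "B1", "trait": "power"},    # suited to up/down flicks
    {"name": "C1", "trait": "wide"},     # suited to left/right flicks
]

# Assumed mapping of operation type -> required characteristic,
# following the examples given in the text.
TRAIT_FOR_OPERATION = {
    "tap": "homing",
    "flick_up": "power",
    "flick_down": "power",
    "flick_lr": "wide",
}

def candidate_actions(operation_type, acquired=ACTIONS):
    """List the names of acquired actions whose characteristic matches
    the operation type of the slot being edited."""
    trait = TRAIT_FOR_OPERATION[operation_type]
    return [a["name"] for a in acquired if a["trait"] == trait]
```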

 Next, an example of how operation sequence information is edited on the edit screen will be described. An action icon 115b can be fitted into an editable slot 115ai, whereupon the action represented by the fitted icon is associated with the operation of that slot's order. The operation for fitting an action icon 115b into a slot 115ai may be, for example, a drag operation, or tapping the slot 115ai and the action icon 115b in that order or in reverse order. However, the fitting operation is not limited to these.

 An action icon 115b fitted into a slot 115ai can be cleared by a predetermined operation, leaving the slot empty. The clearing operation may be, for example, dragging the fitted action icon 115b outside the frame of the slot 115ai, although it is not limited to such an operation. The edit screen may also provide an "all clear" function that empties every slot 115ai in every operation sequence component 115a; in this case, the edit screen includes an operation button (not shown) for instructing all clear.

 When no action of the corresponding order is associated with a slot 115ai, the slot is displayed so as to indicate that it is empty. In the examples of FIGS. 10(A) and 10(B), the hatched slots 115ai are empty.

 For example, in FIG. 10(A), the operation sequence component 115a for the tap operation contains five slots 115ai, so the maximum consecutive number N is 5. Slots 115a1 to 115a4 are editable, of which slots 115a2 to 115a4 are empty, while slot 115a5 is not editable; the editable number M is therefore 4. In other words, for this held character, consecutive actions can be executed up to the fourth stage (the editable number) out of the maximum of five consecutive tap operations.

 The total cost of the actions associated with the operations in each set of operation sequence information is constrained not to exceed a cost upper limit. For example, whether the total cost of the action icons 115b fitted into the slots 115ai of each operation sequence component 115a exceeds the upper limit is judged when editing of the operation sequence information is finished. When an operation instructing the end of editing is performed and the cost total does not exceed the upper limit, the edit screen generation unit 117 saves operation sequence information reflecting the edits in the storage unit 120 as user information 133. If the cost total exceeds the upper limit, the edit screen generation unit 117 does not save the operation sequence information; in this case, it may display an indication that the cost total exceeds the upper limit and continue displaying the edit screen.
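The cost check performed at the end of editing amounts to a simple sum-and-compare. The sketch below assumes a representation in which each operation-type row is a list whose entries are either an assigned action (with a cost) or `None` for an empty slot; the names are not from the source.

```python
def total_cost(sequence_rows):
    """Sum the costs of every assigned slot across all operation-type rows."""
    return sum(action["cost"]
               for row in sequence_rows
               for action in row
               if action is not None)

def can_save(sequence_rows, cost_limit):
    """The edits are saved only if the cost total stays within the limit;
    otherwise the edit screen remains displayed for further changes."""
    return total_cost(sequence_rows) <= cost_limit
```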

  (Details of continuous operation determination unit 118)
 The continuous operation determination unit 118 will now be described in detail. As described above, the continuous operation determination unit 118 determines whether a given operation is continuous with the previous operation. "Continuous with the previous operation" means that the current operation was performed within a fixed period after the series of motions of the operation character based on the previous operation started. This fixed period may be, for example, the period during which the operation character is performing the preparation motion for an attack based on the previous operation, or the period until the return motion finishes; alternatively, operations may still be accepted for a predetermined time after the return motion finishes. For example, an operation performed before the series of motions of the operation character based on the previous operation has finished may be treated as continuous.

 That is, the continuous operation determination unit 118 need only judge whether the current operation was performed within a fixed period after the series of motions of the operation character based on the previous operation started. For example, when an operation is performed while an animation showing the operation character's pre-attack preparation motion based on the previous operation is playing, the continuous operation determination unit 118 may determine that the operation is continuous with the previous operation. Likewise, when an operation is performed before a fixed period has elapsed after the operation character's return motion based on the previous operation has finished, the continuous operation determination unit 118 may determine that the operation is continuous with the previous operation.
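The continuity test described above reduces to a time-window comparison. The sketch below measures time in milliseconds and uses an assumed window length; the source specifies only "a fixed period", not its value or units.

```python
# Assumed window length; the actual value would be tuned per game/motion.
CONTINUITY_WINDOW_MS = 800

def is_continuous(prev_motion_start_ms, current_input_ms,
                  window_ms=CONTINUITY_WINDOW_MS):
    """True if the current input arrived within `window_ms` after the
    previous action's series of motions started."""
    elapsed = current_input_ms - prev_motion_start_ms
    return 0 <= elapsed <= window_ms
```

The variants in the text (window ending when the return motion finishes, or some time after it) would differ only in how the window endpoint is computed.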

 When the continuous operation determination unit 118 determines that operations are continuous, it refers to the operation sequence information corresponding to the types of the previous and current operations to decide the action to be executed, and notifies the game execution unit 115 of the decided action.

 Specifically, the case where the previous and current operations are of the same type is described first. In this case, the continuous operation determination unit 118 counts what number the current operation is among operations of that type (that is, its order i). The order i can be computed by recording the operation type in a history each time operations are determined to be continuous; the continuous operation determination unit 118 may clear this history when it determines that operations are not continuous. The continuous operation determination unit 118 then refers to the operation sequence information for the current operation's type and decides the action associated with the operation of the current order i as the action to be executed next. Now consider the case where the current order i exceeds the editable number M in the operation sequence information. In the present embodiment, the continuous operation determination unit 118 treats further consecutive operations as impossible and does not decide a next action. That is, once the consecutive count exceeds the editable number M, operations of the same type become invalid until an operation is determined not to be continuous with the previous operation. The present embodiment can thereby encourage the user to perform operations that chain together as many different types of operation sequences as possible.

 However, the processing when the consecutive count exceeds the editable number M is not limited to invalidating the operation as described above. For example, the continuous operation determination unit 118 may instead reset the order to 1 and decide the first-stage action of the operation sequence information as the action to be executed next. In this case, by performing operations of the same type in succession, the user can deliver actions one after another for as many operations as are performed, with no upper limit on the consecutive count.

 Next, the case where the previous and current operations are not of the same type is described. In this case, the continuous operation determination unit 118 refers to the operation sequence information for the current operation's type and decides the first-stage action, associated with the operation of order 1, as the action to be executed next.
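The selection logic of the three preceding paragraphs can be sketched as one function: a same-type input advances the order i; past the editable number M the input is either ignored or, in the variant, wraps back to order 1; a different-type input restarts at the first stage of that type's sequence. The function name, the `(action, order)` return convention, and the dict representation of the operation sequence information are all assumptions.

```python
def next_action(sequence, prev_type, prev_order, op_type, editable_m,
                wrap=False):
    """Decide the next action for a continuous operation.

    sequence   -- dict: operation type -> list of actions (1st stage first)
    prev_type  -- type of the previous operation
    prev_order -- order i reached by the previous operation
    op_type    -- type of the current operation
    editable_m -- editable number M for this sequence
    wrap       -- variant: reset to order 1 instead of ignoring past M

    Returns (action, new_order); (None, prev_order) when the input is
    invalid because the consecutive count exceeded M.
    """
    if op_type == prev_type:
        order = prev_order + 1
        if order > editable_m:
            if not wrap:
                return None, prev_order  # same-type input invalid past M
            order = 1                    # variant: wrap to the 1st stage
    else:
        order = 1  # a different operation type starts from its 1st stage
    return sequence[op_type][order - 1], order
```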

 When notified of the action decided by the continuous operation determination unit 118, the game execution unit 115 notifies the other units so that the operation character executes the action based on the current operation following the action based on the previous operation.

 As an illustrative example, consider the case where the action based on the previous operation and the action based on the current operation both affect an enemy character. Suppose that, as described above, an action affecting an enemy character includes a preparation motion, an attack motion, and a return motion, and that the current operation occurs while the technique motion based on the previous operation is being displayed. In this case, two patterns are conceivable for the timing at which the operation character starts the action based on the current operation.

 In the first pattern, the game execution unit 115 causes the operation character to finish the technique motion of the previous operation, cancel the subsequent return motion, and start the preparation motion of the current operation. In the second pattern, the game execution unit 115 causes the operation character to cancel the technique motion of the previous operation and immediately start the preparation motion of the current operation. In either case, the game execution unit 115 can cause the operation character to execute, in succession and in response to consecutive operations, the actions decided on the basis of the operation sequence information.

 <Example processing of game system 1>
 Next, as examples of the processing of the game system 1, the editing of operation sequence information and the processing performed in response to consecutive operations are described with reference to the drawings.

  (Flow of operation sequence information editing)
 First, the editing of operation sequence information is described with reference to FIG. 11, a flowchart showing the flow of the editing process. It is assumed that a game screen is already displayed on the display unit 152 when the following operations start, and that the displayed game screen includes a menu, operation button, or the like for instructing the start of editing of operation sequence information. In the following description, displaying the processing results of the edit screen generation unit 117 via the display control unit 112-1 is also referred to simply as display by the edit screen generation unit 117.

 In step S101-1, the operation reception unit 111-1 judges whether an operation instructing the start of editing of operation sequence information has been received for a given held character. The target held character may be the operation character selected at that time or some other held character. When an operation instructing the start of editing is received, the next step S102-1 is executed.

 In step S102-1, the edit screen generation unit 117 reads the operation sequence information for that held character from the storage unit 120. The operation sequence information read at this point is either operation sequence information that has already been edited and saved, or operation sequence information in its initial state, in which a predetermined action is associated with each operation.

 In step S103-1, the edit screen generation unit 117 displays the edit screen on the basis of the operation sequence information of the held character.

 In step S104-1, the operation reception unit 111-1 receives an operation on one of the slots 115ai, for example a tap or drag operation on the slot. Hereinafter, the slot 115ai on which the operation was received is also referred to as the operation target slot 115ai.

 In step S105-1, the edit screen generation unit 117 branches the processing according to the state of the slot 115ai on which the operation was received.

 Specifically, consider the case in step S105-1 where the operation target slot 115ai is not editable. In this case, in step S106-1, the edit screen generation unit 117 displays a notification that the slot 115ai cannot be edited.

 Next, consider the case in step S105-1 where an action icon 115b is already fitted into the operation target slot 115ai. Suppose the operation received in step S104-1 is an operation instructing that the slot 115ai be cleared (Yes in step S107-1). In this case, in step S108-1, the edit screen generation unit 117 clears the action icon 115b from the slot 115ai and displays it as an empty slot.

 Next, consider the case in step S105-1 where the operation target slot 115ai is an empty slot. In this case, in step S109-1, the edit screen generation unit 117 displays the list of action icons 115b corresponding to the slot 115ai. The action icons 115b in the list represent, among the actions already acquired by the corresponding held character and by other held characters of the same type, those actions whose characteristics correspond to the operation type of the slot 115ai.

 ステップS110-1において、操作受付部111-1は、アクションアイコン115bの一覧の何れか1つを、スロット115aiまでドラッグする操作を受け付ける。ドラッグ先として有効なスロット115aiは、ステップS104-1で操作された操作対象のスロット115aiと同一の操作列コンポーネント115aに含まれる何れかのスロット115aiであるものとする。 In step S110-1, the operation receiving unit 111-1 receives an operation of dragging any one of the list of action icons 115b to the slot 115ai. It is assumed that the slot 115ai effective as a drag destination is any slot 115ai included in the same operation sequence component 115a as the operation target slot 115ai operated in step S104-1.

 ステップS111-1において、編集画面生成部117は、ドラッグ操作されたアクションアイコン115bを、ドラッグ先のスロット115aiに嵌め込んだ画像を表示する。 In step S111-1, the edit screen generation unit 117 displays an image in which the dragged action icon 115b is inserted in the drag-destination slot 115ai.

 ステップS112-1において、編集終了を指示する操作が受け付けられていなければ、編集画面生成部117は、ステップS104-1からの動作を繰り返す。 If it is determined in step S112-1 that an operation for instructing the end of editing has not been accepted, the edit screen generation unit 117 repeats the operation from step S104-1.

 ステップS112-1において、編集終了を指示する操作が受け付けられた場合について説明する。この場合、ステップS113-1において、編集画面生成部117は、コストの総和が上限値を超えるか否かを判断する。具体的には、編集画面生成部117は、各種類の操作列コンポーネント115aに含まれる各スロット115aiに嵌め込まれたアクションアイコン115bのコストの合計を算出し、算出した値が上限値を超えるか否かを判断すればよい。 A case will be described in which an operation for instructing the end of editing is accepted in step S112-1. In this case, in step S113-1, the edit screen generation unit 117 determines whether or not the total cost exceeds the upper limit value. Specifically, the edit screen generation unit 117 calculates the total cost of the action icons 115b fitted into the slots 115ai of each type of operation sequence component 115a, and determines whether the calculated value exceeds the upper limit value.
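
The cost check of step S113-1 amounts to a summation over all fitted action icons. The following sketch illustrates it under assumed data structures; the slot/icon representation and the example values are hypothetical.

```python
# Hypothetical sketch of the step S113-1 cost check: sum the cost of every
# action icon fitted into every slot of every operation sequence component,
# and compare the total against the upper limit.
def total_cost_exceeds_limit(operation_components, upper_limit):
    total = sum(
        icon["cost"]
        for component in operation_components   # each type of operation sequence
        for icon in component["slots"]          # action icons fitted in slots
        if icon is not None                     # empty slots contribute nothing
    )
    return total > upper_limit

components = [
    {"type": "tap",   "slots": [{"cost": 3}, {"cost": 2}, None]},
    {"type": "flick", "slots": [{"cost": 4}]},
]
print(total_cost_exceeds_limit(components, 10))  # total 9 -> False
print(total_cost_exceeds_limit(components, 8))   # total 9 -> True
```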

 ステップS113-1において、コストの総和が上限値を超える場合、編集画面生成部117は、その旨を表す通知を表示する。そして、編集画面生成部117は、ステップS104-1からの動作を繰り返す。 If the total cost exceeds the upper limit value in step S113-1, the edit screen generation unit 117 displays a notification to that effect. Then, the edit screen generation unit 117 repeats the processing from step S104-1.

 ステップS113-1において、コストの総和が上限値以下である場合について説明する。この場合、ステップS114-1において、編集画面生成部117は、各操作列コンポーネント115aにおける編集内容を反映させた操作列情報を、記憶部120-1に保存する。 A case will be described where the total cost is equal to or less than the upper limit value in step S113-1. In this case, in step S114-1, the edit screen generation unit 117 stores operation sequence information reflecting the edit contents in each operation sequence component 115a in the storage unit 120-1.

 ステップS115-1において、表示制御部112-1は、編集画面を閉じてゲーム画面を表示する。 In step S115-1, the display control unit 112-1 closes the editing screen and displays the game screen.

  (編集画面例)
 次に、編集画面の一例について説明する。図12(A)(B)(C)は、編集画面の一例を表す図である。
(Edit screen example)
Next, an example of the edit screen will be described. FIGS. 12A, 12B, and 12C are diagrams illustrating examples of the edit screen.

 図12(A)に示す編集画面は、ある保有キャラクタについて、操作列情報の編集開始が指示されたときに表示される(ステップS103-1)。この編集画面は、操作の各種類の操作列コンポーネント115aを含む。例えば、タップ操作の操作列コンポーネント115aにおいて、スロット115a1~115a4には、アクションA3、A6、A1、A5を表すアクションアイコン115bが嵌め込まれている。また、スロット115a5は、編集不可能であることを表すよう、カギマークが表示されている。 The editing screen shown in FIG. 12A is displayed when an instruction to start editing operation sequence information is given for a certain retained character (step S103-1). This edit screen includes an operation sequence component 115a for each type of operation. For example, in the operation column component 115a for the tap operation, action icons 115b representing actions A3, A6, A1, and A5 are fitted in the slots 115a1 to 115a4. In addition, a key mark is displayed in the slot 115a5 to indicate that editing is impossible.

 図12(B)に示す編集画面は、図12(A)の編集画面において、タップ操作の操作列コンポーネント115aに含まれるスロット115a2が枠外にドラッグ操作されたときに表示されたものである(ステップS104-1、S105-1で「アクション有り」、S107-1でYes、S108-1)。すなわち、この操作により、タップ操作のスロット115a2は、空きスロットとなる。また、空きスロットとなったスロット115a2が選択されているため、タップ操作に応じたアクションアイコン115bの一覧が追加して表示されている(ステップS105-1で「空きスロット」、S109-1)。 The edit screen shown in FIG. 12B is displayed when the slot 115a2 included in the tap-operation operation sequence component 115a is dragged out of the frame on the edit screen of FIG. 12A ("action present" in steps S104-1 and S105-1, Yes in S107-1, S108-1). That is, as a result of this operation, the slot 115a2 for the tap operation becomes an empty slot. In addition, since the slot 115a2 that has become an empty slot is selected, a list of action icons 115b corresponding to the tap operation is additionally displayed ("empty slot" in step S105-1, S109-1).

 図12(C)に示す編集画面は、図12(B)に示す編集画面において、アクションA4を表すアクションアイコン115bが、タップ操作のスロット115a2までドラッグされた場合に表示される。すなわち、このドラッグ操作により、アクションA4を表すアクションアイコン115bが、タップ操作のスロット115a2に嵌め込まれている(ステップS110-1、S111-1)。 The edit screen shown in FIG. 12C is displayed when the action icon 115b representing the action A4 is dragged to the slot 115a2 for the tap operation on the edit screen shown in FIG. That is, by this drag operation, the action icon 115b representing the action A4 is inserted into the tap operation slot 115a2 (steps S110-1 and S111-1).

 この状態で編集終了すると、各種類の操作列情報は、次のように保存される。すなわち、タップ操作の操作列情報において、1段目~4段目の操作に対して、アクションA3、A4、A1、A5が関連付けられる。左右フリックの操作列情報において、1段目の操作に対して、アクションC3が関連付けられる。下フリックの操作列情報においては、アクションが関連付けられない。回転操作の操作列情報において、1段目の操作に対してアクションE3が関連付けられる。 When editing is completed in this state, each type of operation sequence information is saved as follows. That is, actions A3, A4, A1, and A5 are associated with the operations in the first to fourth stages in the operation sequence information of the tap operation. In the left / right flick operation sequence information, the action C3 is associated with the first-stage operation. No action is associated with the operation sequence information of the lower flick. In the operation sequence information of the rotation operation, the action E3 is associated with the first-stage operation.
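
The saved state described above can be pictured as a mapping from operation type to an ordered list of actions. The representation below is hypothetical (the embodiment does not prescribe a storage format), but it reproduces the associations of FIG. 12(C).

```python
# Hypothetical in-memory form of the saved operation sequence information
# after the edits of FIG. 12(C): each operation type maps to the ordered
# actions for its 1st, 2nd, ... operations.
operation_sequences = {
    "tap":        ["A3", "A4", "A1", "A5"],  # 1st-4th tap
    "side_flick": ["C3"],                    # 1st left/right flick
    "down_flick": [],                        # no action associated
    "rotate":     ["E3"],                    # 1st rotation
}

def action_for(op_type, order):
    """Return the action for the order-th (1-based) consecutive operation
    of op_type, or None if no action is associated."""
    seq = operation_sequences.get(op_type, [])
    return seq[order - 1] if 1 <= order <= len(seq) else None

print(action_for("tap", 2))         # A4
print(action_for("down_flick", 1))  # None
```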

 (連続する操作に応じた処理)
 次に、タッチスクリーン15に対して連続する操作に応じた処理について、図13を参照して説明する。図13は、連続する操作に応じた処理の流れを示すフローチャートである。
(Processing according to continuous operation)
Next, processing according to consecutive operations on the touch screen 15 will be described with reference to FIG. 13. FIG. 13 is a flowchart showing the flow of processing according to consecutive operations.

 ステップS201-1において、操作受付部111-1がゲームを開始する操作を受け付けると、ゲーム実行部115は、ゲームを開始する。 In step S201-1, when the operation accepting unit 111-1 accepts an operation for starting a game, the game executing unit 115 starts the game.

 ステップS202-1において、ゲーム実行部115は、サーバ200から取得した各種ゲーム情報に基づいて、操作キャラクタが存在するゲーム空間を表すゲーム画面を生成する。そして、表示制御部112-1は、生成されたゲーム画面をタッチスクリーン15に表示する。 In step S202-1, the game execution unit 115 generates a game screen representing a game space in which an operation character exists based on various game information acquired from the server 200. Then, the display control unit 112-1 displays the generated game screen on the touch screen 15.

 ステップS203-1において、操作受付部111-1は、操作キャラクタについての操作列情報を記憶部120-1から読み込む。 In step S203-1, the operation reception unit 111-1 reads operation sequence information about the operation character from the storage unit 120-1.

 ステップS204-1において、操作受付部111-1が、タッチスクリーン15に対する操作を受け付けると、ステップS205-1において、連続操作判定部118は、受け付けられた今回の操作が、前回の操作に連続しているか否かを判断する。 When the operation reception unit 111-1 receives an operation on the touch screen 15 in step S204-1, the continuous operation determination unit 118 determines in step S205-1 whether or not the received current operation is continuous with the previous operation.

 具体的には、連続操作判定部118は、今回の操作が、前回の操作に基づく操作キャラクタのアクションを表すモーションの終了前に受け付けられていた場合に、連続していると判断する。それ以外の場合、連続操作判定部118は、連続していないと判断する。なお、連続しているか否かの判断基準としては、これに限らず、他の基準も採用可能である。 Specifically, the continuous operation determination unit 118 determines that the operations are continuous when the current operation was received before the end of the motion representing the operating character's action based on the previous operation. Otherwise, the continuous operation determination unit 118 determines that they are not continuous. The criterion for determining continuity is not limited to this, and other criteria can also be adopted.
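
Under this criterion, the continuity judgment of step S205-1 reduces to comparing the arrival time of the current operation with the end time of the motion for the previous action. The following is a minimal sketch with assumed time bookkeeping; the function and parameter names are illustrative only.

```python
# Hypothetical sketch of the continuity criterion: the current operation is
# continuous with the previous one if it arrives before the motion that
# represents the previous action has finished.
def is_continuous(current_op_time, prev_motion_end_time):
    # No previous operation (None) -> not continuous; the chain starts fresh.
    return prev_motion_end_time is not None and current_op_time < prev_motion_end_time

print(is_continuous(1.2, 1.5))   # True: input arrived while the motion was playing
print(is_continuous(2.0, 1.5))   # False: the motion had already ended
print(is_continuous(0.5, None))  # False: no previous operation
```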

 ステップS205-1において、今回の操作が前回の操作に連続していないと判断した場合について説明する。この場合、ステップS206-1において、連続操作判定部118は、この操作キャラクタの操作列情報のうち、今回の操作の種類に関する操作列情報を参照する。そして、連続操作判定部118は、操作列情報の先頭の操作に関連付けられたアクションを、実行すべきアクションとして決定する。 A case will be described where it is determined in step S205-1 that the current operation is not continuous with the previous operation. In this case, in step S206-1, the continuous operation determination unit 118 refers to the operation sequence information regarding the type of operation of the current operation among the operation sequence information of the operation character. Then, the continuous operation determination unit 118 determines an action associated with the first operation in the operation sequence information as an action to be executed.

 一方、ステップS205-1において、今回の操作が前回の操作に連続していると判断した場合について説明する。この場合、ステップS207-1において、連続操作判定部118は、今回の操作の種類と、前回の操作の種類とが、同一であるか否かを判断する。 On the other hand, a case will be described where it is determined in step S205-1 that the current operation is continuous with the previous operation. In this case, in step S207-1, the continuous operation determination unit 118 determines whether or not the type of the current operation and the type of the previous operation are the same.

 ステップS207-1において、同一の種類でないと判断した場合、ステップS206-1が実行される。これにより、今回の種類の操作列情報において先頭の操作に関連付けられたアクションが、実行すべきアクションとして決定される。 If it is determined in step S207-1 that they are not of the same type, step S206-1 is executed. As a result, the action associated with the first operation in this type of operation sequence information is determined as the action to be executed.

 ステップS207-1において、同一の種類であると判断した場合、ステップS208-1が実行される。 If it is determined in step S207-1 that they are of the same type, step S208-1 is executed.

 ステップS208-1において、連続操作判定部118は、今回の操作が、この種類の操作の何段目であるか(すなわち、順序)を特定する。 In step S208-1, the continuous operation determination unit 118 identifies which consecutive operation of this type the current operation is (that is, its order).

 ステップS209-1において、連続操作判定部118は、ステップS208-1で特定した順序が、この種類の操作の編集可能数M以下であるか否かを判断する。 In step S209-1, the continuous operation determination unit 118 determines whether or not the order specified in step S208-1 is equal to or less than the editable number M of this type of operation.

 ステップS209-1において、操作の順序が編集可能数Mを超えている場合、実行すべきアクションが決定されることなく、ステップS204-1からの操作が繰り返される。すなわち、編集可能数Mを超えて連続する同種の操作は、前回の操作に連続していないと判定されるまで、無効となる。 In step S209-1, if the order of operations exceeds the editable number M, the operation from step S204-1 is repeated without determining the action to be executed. That is, the same type of operation that exceeds the editable number M is invalid until it is determined that it is not continuous with the previous operation.

 ステップS209-1において、操作の順序が編集可能数M以下である場合について説明する。この場合、ステップS210-1において、連続操作判定部118は、今回の種類の操作列情報において、ステップS208-1で特定した順序の操作に関連付けられたアクションを、実行すべきアクションとして決定する。 The case where the order of operations is the editable number M or less in step S209-1 will be described. In this case, in step S210-1, the continuous operation determination unit 118 determines an action associated with the operation in the order specified in step S208-1 as an action to be executed in the operation sequence information of this type.

 ステップS211-1において、ゲーム実行部115は、操作キャラクタに、決定されたアクションを実行させるよう、各部に通知する。具体的には、アニメーション生成部114-1は、操作キャラクタが、決定されたアクションのモーションを行うアニメーションを生成する。また、表示制御部112-1は、生成されたアニメーションを表示する。なお、ゲーム実行部115は、操作キャラクタが実行中の前回のアクションに基づく表示を途中で取りやめて、今回決定されたアクションに基づく表示を行うよう各部を制御する。このとき、この時点で表示されているモーションが前回のアクションの戻りモーションであれば、ゲーム実行部115は、戻りモーションの表示を直ちにキャンセルして今回のアクションのモーションを表示するよう制御する。また、この時点で表示されているモーションが前回のアクションの攻撃モーションであれば、ゲーム実行部115は、攻撃モーションの表示が終了するまで待機してから戻りモーションの表示をキャンセルして今回のアクションのモーションを表示するよう制御してもよい。あるいは、この場合、ゲーム実行部115は、攻撃モーションの表示をキャンセルして今回のアクションのモーションを表示するよう制御してもよい。 In step S211-1, the game execution unit 115 notifies each unit to cause the operating character to execute the determined action. Specifically, the animation generation unit 114-1 generates an animation in which the operating character performs the motion of the determined action, and the display control unit 112-1 displays the generated animation. Note that the game execution unit 115 controls each unit so that the display based on the previous action being executed by the operating character is cancelled partway through and the display based on the newly determined action is performed. At this time, if the motion displayed at this point is the return motion of the previous action, the game execution unit 115 performs control to immediately cancel the display of the return motion and display the motion of the current action. If the motion displayed at this point is the attack motion of the previous action, the game execution unit 115 may perform control to wait until the display of the attack motion ends, then cancel the display of the return motion and display the motion of the current action. Alternatively, in this case, the game execution unit 115 may perform control to cancel the display of the attack motion and display the motion of the current action.
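
The interruption policy described here can be summarized as a small decision function. Both variants for the attack-motion case are sketched, since the embodiment permits either; the phase names and the boolean switch are assumptions for illustration.

```python
# Hypothetical sketch of the interruption policy for step S211-1: decide
# what to do with the previous action's motion when a new action is decided.
# wait_for_attack=True reflects the variant that lets the attack motion
# finish first; False reflects the variant that cancels it immediately.
def interruption(current_motion_phase, wait_for_attack=True):
    if current_motion_phase == "return":
        return "cancel_now"       # the return motion is cancelled immediately
    if current_motion_phase == "attack":
        return "wait_then_switch" if wait_for_attack else "cancel_now"
    return "play_new"             # no motion in progress: just play the new one

print(interruption("return"))                         # cancel_now
print(interruption("attack"))                         # wait_then_switch
print(interruption("attack", wait_for_attack=False))  # cancel_now
```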

 そして、制御部110-1は、ステップS204-1からの動作を繰り返す。 Then, the control unit 110-1 repeats the processing from step S204-1.

 以上で、連続する操作に応じた処理の説明を終了する。 This completes the description of the processing according to the continuous operation.

 例えば、図12(C)に例示した編集画面により操作列情報が保存されているとする。この場合に、例えば、タップ操作、タップ操作、左右フリック操作の順に連続して操作が受け付けられたとする。すると、1段目のタップ操作に関連付けられたアクションA3、2段目のタップ操作に関連付けられたアクションA4、1段目の左右フリック操作に関連付けられたアクションC3、の順にアクションが繰り出されることになる。また、タップ操作が連続して5回以上受け付けられたとする。この場合、タップ操作の操作列情報において、4段目までしかアクションが関連付けられていない。すなわち、編集可能数M=4である。したがって、この場合、アクションA3、A4、A1、A5の順に4段目までアクションが繰り出される。その後、4段目のアクションA5が敵キャラクタを攻撃するアクションであるとすると、5回目以降の連続するタップ操作は、アクションA5の戻りモーションが終了するまで、無効となる。 For example, assume that the operation sequence information edited on the edit screen illustrated in FIG. 12C has been saved. In this case, suppose that operations are received consecutively in the order of a tap operation, a tap operation, and a left/right flick operation. Then, the actions are delivered in the order of action A3 associated with the first-stage tap operation, action A4 associated with the second-stage tap operation, and action C3 associated with the first-stage left/right flick operation. Next, suppose that tap operations are received consecutively five times or more. In the operation sequence information for the tap operation, actions are associated only up to the fourth stage; that is, the editable number M = 4. Therefore, in this case, the actions are delivered up to the fourth stage in the order of actions A3, A4, A1, and A5. Thereafter, if the fourth-stage action A5 is an action that attacks an enemy character, the fifth and subsequent consecutive tap operations are invalid until the return motion of action A5 ends.
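
The worked example above can be traced with a short sketch combining steps S205-1 to S210-1 for a run of operations assumed to be continuous. The function and data names are hypothetical, and the editable number M is taken as the length of each sequence, as in the example.

```python
# Hypothetical trace of the action resolution for consecutive operations:
# count the order within a run of the same operation type (step S208-1),
# look up the associated action (step S210-1), and ignore operations whose
# order exceeds the editable number M (step S209-1). A change of type
# restarts at the first entry of the new type's sequence (via S206-1).
OPERATION_SEQUENCES = {
    "tap":        ["A3", "A4", "A1", "A5"],
    "side_flick": ["C3"],
}

def resolve(operations):
    actions, prev_type, order = [], None, 0
    for op in operations:
        order = order + 1 if op == prev_type else 1  # position within the run
        prev_type = op
        seq = OPERATION_SEQUENCES.get(op, [])
        if order <= len(seq):                        # within the editable number M
            actions.append(seq[order - 1])
        # else: the operation is ignored (invalid), as in step S209-1
    return actions

print(resolve(["tap", "tap", "side_flick"]))  # ['A3', 'A4', 'C3']
print(resolve(["tap"] * 6))                   # ['A3', 'A4', 'A1', 'A5']
```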

 <本実施形態の効果>
 このように、本実施形態では、操作の種類毎に操作列情報を編集可能とし、連続した操作を受け付けた場合には、編集された操作列情報にしたがって決定したアクションを操作キャラクタに実行させる。このため、本実施形態は、連続する操作に応じたゲーム展開を、ユーザが試行錯誤しながら変化させることができる。これにより、ユーザは、連続した操作に応じて多様なゲーム展開を楽しむことができ、興趣性がより高まる。
<Effect of this embodiment>
As described above, in this embodiment, the operation sequence information can be edited for each type of operation, and when consecutive operations are received, the operating character is caused to execute the actions determined according to the edited operation sequence information. This allows the user to vary, through trial and error, how the game unfolds in response to consecutive operations. As a result, the user can enjoy diverse game developments in response to consecutive operations, which further enhances the appeal of the game.

 また、本実施形態では、保有キャラクタ毎の編集画面において、操作列に含まれる各操作に対して関連付け可能なアクションは、当該保有キャラクタによって取得済みのアクションまたは同種別の他の保有キャラクタによって取得済みのアクションである。このため、本実施形態は、その保有キャラクタの世界観またはその種別の世界観にあったアクションを、当該保有キャラクタに行わせることができる。 Further, in the present embodiment, on the edit screen for each owned character, the actions that can be associated with each operation included in the operation sequence are actions already acquired by that owned character or by another owned character of the same type. This allows the owned character to perform actions that fit the world view of that character or of its type.

 また、本実施形態では、保有キャラクタ毎の編集画面において、操作列に含まれる各操作に対して関連付け可能なアクションは、保有キャラクタが満たす条件に応じて追加される。また、操作列において編集可能な操作数は、保有キャラクタが満たす条件に応じて増加する。このため、本実施形態は、ゲームプレイをすればするほど、保有キャラクタに関する操作列情報をより有利に編集して、保有キャラクタを強くすることができる。 In the present embodiment, on the edit screen for each owned character, the actions that can be associated with each operation included in the operation sequence are added according to the conditions satisfied by the owned character. In addition, the number of operations that can be edited in the operation sequence increases according to the conditions satisfied by the owned character. Therefore, in this embodiment, the more the user plays the game, the more advantageously the operation sequence information of an owned character can be edited, making that character stronger.

 〔変形例1:お勧め操作列情報の自動設定〕
 本実施形態では、編集画面生成部117が生成する編集画面は、操作列情報を編集する機能を含むものとして説明した。さらに、編集画面は、お勧め操作列情報を自動設定する機能を含んでいてもよい。お勧め操作列情報とは、操作列に含まれる各操作に対して、予め推奨されるアクションを関連付けた操作列情報である。例えば、編集画面は、お勧め操作列情報の自動設定を指示する操作に応じて、お勧め操作列情報に基づいて、操作列に含まれる各操作に対して推奨されるアクションを一括して関連付ける。具体的には、操作列コンポーネント115aのスロット115aiに、お勧め操作列情報にしたがったアクションアイコン115bが一括して自動で嵌め込まれてもよい。
[Variation 1: Automatic setting of recommended operation sequence information]
In the present embodiment, the edit screen generated by the edit screen generation unit 117 has been described as including a function for editing operation sequence information. The edit screen may further include a function for automatically setting recommended operation sequence information. The recommended operation sequence information is operation sequence information in which a recommended action is associated in advance with each operation included in the operation sequence. For example, in response to an operation instructing automatic setting of the recommended operation sequence information, the edit screen collectively associates the recommended action with each operation included in the operation sequence based on the recommended operation sequence information. Specifically, the action icons 115b according to the recommended operation sequence information may be automatically and collectively fitted into the slots 115ai of the operation sequence components 115a.

 また、編集画面は、複数のお勧め操作列情報の何れかを選択させて自動設定を行う機能を含んでいてもよい。具体的には、複数のお勧め操作列情報は、それぞれ異なる特性を有するよう構成されていてもよい。この場合、ユーザは、特性を考慮してお勧め操作列情報を選択可能である。 In addition, the editing screen may include a function for selecting any of a plurality of recommended operation sequence information and performing automatic setting. Specifically, the plurality of recommended operation sequence information may be configured to have different characteristics. In this case, the user can select recommended operation sequence information in consideration of characteristics.

 図14は、オススメ操作列情報の自動設定を行う編集画面の一例である。図14(A)の編集画面は、「オススメ」ボタン115dを含む。「オススメ」ボタン115dは、お勧め操作列情報の自動設定を指示する操作を受け付けるボタンである。 FIG. 14 is an example of an edit screen for automatically setting recommended operation sequence information. The editing screen of FIG. 14A includes a “recommend” button 115d. The “recommend” button 115d is a button for accepting an operation for instructing automatic setting of recommended operation sequence information.

 図14(B)は、図14(A)の編集画面において、「オススメ」ボタン115dが操作されると表示されるお勧め操作列情報の選択画面の一例である。この例では、「オススメ1」のお勧め操作列情報は、最後の決め技の威力を重視した構成となっている。また、「オススメ2」のお勧め操作列情報は、連続して繰り出されるアクションの間に隙が生じないことに重点を置いた構成となっている。 FIG. 14B is an example of a selection screen for recommended operation sequence information that is displayed when the "recommend" button 115d is operated on the edit screen of FIG. 14A. In this example, the recommended operation sequence information of "Recommendation 1" is configured with emphasis on the power of the final finishing move, while the recommended operation sequence information of "Recommendation 2" is configured with emphasis on leaving no opening between consecutively delivered actions.

 図14(C)は、図14(B)の編集画面において、「オススメ1」が選択されて自動設定が行われた編集画面の一例である。この例では、「オススメ1」における各種類の操作列情報にしたがって、該当するアクションアイコン115bが、該当するスロット115aiに、自動的に嵌め込まれている。 FIG. 14C is an example of an editing screen in which “Recommendation 1” is selected and automatic setting is performed on the editing screen of FIG. 14B. In this example, the corresponding action icon 115b is automatically fitted in the corresponding slot 115ai in accordance with each type of operation sequence information in “Recommendation 1”.

 また、編集画面は、自動設定されるお勧め操作列情報に応じて、推奨される連続操作のお勧めパターンを提示する機能をさらに含んでいてもよい。連続操作のお勧めパターンとは、例えば、「タップ、タップ、タップ、上フリック」等といったように、連続する操作として推奨される操作の種類及び順序を表す情報である。ユーザは、お勧め操作列情報を自動設定した上で、お勧めパターンの連続操作を行うことにより、例えば、最後の決め技まで連続して繰り出すアクションを効果的につなげることができ、より興趣性が向上する。
The edit screen may further include a function of presenting a recommended pattern of consecutive operations according to the automatically set recommended operation sequence information. The recommended pattern of consecutive operations is information indicating the type and order of operations recommended as consecutive operations, such as "tap, tap, tap, up flick". By automatically setting the recommended operation sequence information and then performing the consecutive operations of the recommended pattern, the user can, for example, effectively chain the actions delivered consecutively up to the final finishing move, which further enhances the appeal of the game.

 また、編集画面は、お勧め操作列情報の構成をさらに強化可能な未取得のアクションがあれば、当該アクションを保有キャラクタに取得させる条件を提示する機能をさらに含んでいてもよい。取得する条件とは、例えば、ゲーム空間内で所定のイベントに参加して所定のミッションをクリアすることであってもよいが、これに限られない。また、編集画面は、ユーザが新たに獲得したアクションがある場合に、当該アクションを含めるように、連続操作のお勧めパターンを提示する機能を有してもよい。例えば、ユーザが新たに獲得したアクションがあり、当該アクションが操作列情報に関連付けられたことがないことを想定する。この場合に、編集画面において、当該アクションを含む連続操作のお勧めパターンが提示されてもよい。 In addition, the edit screen may further include a function for presenting a condition for causing the possessed character to acquire the action if there is an unacquired action that can further enhance the configuration of the recommended operation sequence information. The acquisition condition may be, for example, to participate in a predetermined event in the game space and clear a predetermined mission, but is not limited thereto. Further, when there is an action newly acquired by the user, the editing screen may have a function of presenting a recommended pattern for continuous operation so as to include the action. For example, it is assumed that there is an action newly acquired by the user and the action has never been associated with the operation sequence information. In this case, a recommended pattern for continuous operation including the action may be presented on the editing screen.

 本変形例では、ユーザは、お勧め操作列情報を利用することにより、お勧めの操作列情報に基づくゲーム展開を手軽に楽しむことができる。また、ユーザは、そのようなお勧め操作列情報を設定した場合に有効な連続する操作のパターンを、容易に習得することができる。また、そのようなお勧め操作列情報の設定に必要なアクションを取得するために、ゲームをプレイすることに対するユーザの動機付けが強化される。 In this modification, the user can easily enjoy the game development based on the recommended operation sequence information by using the recommended operation sequence information. In addition, the user can easily learn a pattern of continuous operations effective when such recommended operation sequence information is set. In addition, in order to acquire an action necessary for setting such recommended operation sequence information, the motivation of the user for playing the game is strengthened.

 〔変形例2:操作列情報の共有〕
 本実施形態では、操作列情報は、ユーザ毎に編集されるものとして説明した。さらに、編集画面生成部117が生成する編集画面は、他のユーザによって編集された操作列情報に基づいて自動設定を行う機能を含むよう変形可能である。例えば、編集画面は、当該ユーザが編集した任意の保有キャラクタの操作列情報を、他のユーザに対して公開する機能を有していてもよい。また、例えば、編集画面は、当該ユーザが編集する任意の保有キャラクタの編集画面において、他のユーザによって公開されている操作列情報を取得して自動設定する機能を有していてもよい。このとき、例えば、操作列情報を公開する機能は、操作列情報とともにユーザのコメントを公開する機能を含んでいてもよい。また、この場合、他のユーザの操作列情報を自動設定する機能は、1つ以上の他のユーザの操作列情報とともに公開されたコメントをユーザに提示した上で、自動設定する操作列情報を選択させる機能を含んでいてもよい。
[Variation 2: Sharing operation sequence information]
In the present embodiment, the operation sequence information has been described as being edited for each user. Furthermore, the edit screen generated by the edit screen generation unit 117 can be modified to include a function for performing automatic setting based on operation sequence information edited by another user. For example, the edit screen may have a function of publishing, to other users, the operation sequence information of any owned character edited by the user. The edit screen may also have a function of acquiring, on the edit screen of any owned character edited by the user, operation sequence information published by other users and setting it automatically. In this case, for example, the function of publishing operation sequence information may include a function of publishing the user's comment together with the operation sequence information, and the function of automatically setting another user's operation sequence information may include a function of presenting to the user the comments published together with the operation sequence information of one or more other users and then letting the user select the operation sequence information to be set automatically.

 図15は、このように変形した場合の編集画面の一例である。図15(A)に示す編集画面は、公開ボタン115eと、取得ボタン115fを含む。公開ボタン115eが操作されると、編集画面生成部117は、編集された操作列情報を、公開対象として、サーバ200に送信する。 FIG. 15 shows an example of the edit screen modified in this way. The edit screen shown in FIG. 15A includes a publish button 115e and an acquisition button 115f. When the publish button 115e is operated, the edit screen generation unit 117 transmits the edited operation sequence information to the server 200 for publication.

 図15(B)は、図15(A)の編集画面において、取得ボタン115fが操作された場合に表示される選択画面の一例である。編集画面生成部117は、取得ボタン115fに対する操作に応じて、サーバ200から、公開された操作列情報の一覧を取得して、選択可能に表示する。この例では、ユーザ1によって公開された操作列情報1と、ユーザ2によって公開された操作列情報2とが、選択可能に表示されている。この場合、一覧から何れかの操作列情報が選択されると、編集画面生成部117は、公開された操作列情報の一覧から選択された操作列情報に基づいて、該当するアクションアイコン115bを該当するスロット115aiに自動設定すればよい。 FIG. 15B is an example of a selection screen displayed when the acquisition button 115f is operated on the edit screen of FIG. 15A. In response to an operation on the acquisition button 115f, the edit screen generation unit 117 acquires a list of published operation sequence information from the server 200 and displays it in a selectable manner. In this example, operation sequence information 1 published by user 1 and operation sequence information 2 published by user 2 are displayed in a selectable manner. In this case, when any operation sequence information is selected from the list, the edit screen generation unit 117 may automatically set the corresponding action icons 115b into the corresponding slots 115ai based on the operation sequence information selected from the list of published operation sequence information.

 本変形例では、ユーザは、他のユーザとの間で操作列情報を共有することにより、他のユーザにより設定された操作列情報に基づくゲーム展開を楽しむことができる。 In this modification, the user can enjoy the game development based on the operation sequence information set by other users by sharing the operation sequence information with other users.

 〔変形例3〕
 本実施形態では、操作列は、同種の操作よりなるものとして説明した。これに限らず、本実施形態は、操作列が、1つ以上の種類の操作からなるよう変形することが可能である。図16は、本変形例における編集画面の一例である。図16に示すように、編集画面は、1つ以上の操作列コンポーネント315aを含む。各操作列コンポーネント315aは、スロット315aiを含む。また、編集画面の下部には、アクションアイコン315bの一覧が表示される。アクションアイコン315bは、上述したアクションアイコン115bに対して、アクションの名称に加えて、関連付け可能な操作の種類(例えば、タップ)が表示されている点が異なる。その他の点については、アクションアイコン115bと同様に構成されるため、詳細な説明を繰り返さない。また、操作列コンポーネント315aは、上述した操作列コンポーネント115aに対して、含まれる各スロット315aiに嵌め込み可能なアクションアイコン315bが異なる操作の種類に対応するものであってもよい点が異なる。その他の点については、操作列コンポーネント115a同様に構成されるため、詳細な説明を繰り返さない。
[Modification 3]
In the present embodiment, the operation sequence has been described as including the same kind of operations. However, the present embodiment is not limited to this, and the operation sequence can be modified to include one or more types of operations. FIG. 16 is an example of an editing screen in this modification. As shown in FIG. 16, the edit screen includes one or more operation sequence components 315a. Each operation sequence component 315a includes a slot 315ai. A list of action icons 315b is displayed at the bottom of the editing screen. The action icon 315b differs from the above-described action icon 115b in that the type of operation (for example, tap) that can be associated is displayed in addition to the name of the action. Since the other points are configured in the same manner as the action icon 115b, detailed description will not be repeated. Further, the operation sequence component 315a is different from the above-described operation sequence component 115a in that the action icons 315b that can be fitted in the respective slots 315ai may correspond to different types of operations. Since the other points are configured in the same manner as the operation sequence component 115a, detailed description will not be repeated.

 この場合、操作列コンポーネント315aの編集内容を反映して保存される操作列情報は、1~N段目までの操作について、操作の種類及びアクションの組を関連付けた情報となる。このような操作列情報を用いる場合、連続操作判定部118は、図13に示した処理において、ステップS207-1の処理をスキップする。また、ステップS210-1において、今回の操作の種類に関する操作列情報を参照する代わりに、それまでに受け付けられた操作の種類の順序に対応する操作列情報を参照すればよい。 In this case, the operation sequence information saved by reflecting the edited content of the operation sequence component 315a is information in which the operation type and the action set are associated with the operations from the 1st to the Nth stages. When such operation sequence information is used, the continuous operation determination unit 118 skips the process of step S207-1 in the process shown in FIG. In step S210-1, instead of referring to the operation sequence information related to the type of operation this time, it is only necessary to refer to operation sequence information corresponding to the order of the types of operations received so far.
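
A sketch of this variant's lookup, under assumed names: the single mixed sequence stores (operation type, action) pairs for the 1st to Nth operations, and the nth consecutive operation yields its action only when its type matches the nth entry. The behavior on a mismatch is an assumption made for this illustration, since the embodiment leaves that choice open.

```python
# Hypothetical sketch of the Modification 3 lookup: one mixed sequence of
# (operation type, action) pairs for the 1st-Nth operations. The n-th
# consecutive operation triggers its action only when the received type
# matches the n-th entry; a mismatch is assumed here to end the chain.
MIXED_SEQUENCE = [("tap", "A3"), ("side_flick", "C1"), ("tap", "A2")]

def resolve_mixed(operations):
    actions = []
    for order, op in enumerate(operations, start=1):
        if order > len(MIXED_SEQUENCE):
            break                       # beyond the editable number of operations
        expected_type, action = MIXED_SEQUENCE[order - 1]
        if op != expected_type:
            break                       # assumed behavior: a mismatch ends the chain
        actions.append(action)
    return actions

print(resolve_mixed(["tap", "side_flick", "tap"]))  # ['A3', 'C1', 'A2']
print(resolve_mixed(["tap", "tap"]))                # ['A3']
```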

 〔ソフトウェアによる実現例〕
 制御部210、ならびに、制御部110-1の制御ブロック(特に、操作受付部111-1、表示制御部112-1、UI制御部113-1、アニメーション生成部114-1、ゲーム実行部115、カメラ配置制御部116、編集画面生成部117及び連続操作判定部118)は、集積回路(ICチップ)等に形成された論理回路(ハードウェア)によって実現してもよいし、CPU(Central Processing Unit)を用いてソフトウェアによって実現してもよい。
[Example of software implementation]
The control unit 210 and the control blocks of the control unit 110-1 (in particular, the operation reception unit 111-1, the display control unit 112-1, the UI control unit 113-1, the animation generation unit 114-1, the game execution unit 115, the camera arrangement control unit 116, the edit screen generation unit 117, and the continuous operation determination unit 118) may be realized by a logic circuit (hardware) formed in an integrated circuit (IC chip) or the like, or may be realized by software using a CPU (Central Processing Unit).

 後者の場合、制御部210または制御部110-1、もしくはその両方を備えた情報処理装置は、各機能を実現するソフトウェアであるプログラムの命令を実行するCPU、上記プログラムおよび各種データがコンピュータ(またはCPU)で読み取り可能に記録されたROM(Read Only Memory)または記憶装置(これらを「記録媒体」と称する)、上記プログラムを展開するRAM(Random Access Memory)などを備えている。そして、コンピュータ(またはCPU)が上記プログラムを上記記録媒体から読み取って実行することにより、本発明の目的が達成される。上記記録媒体としては、「一時的でない有形の媒体」、例えば、テープ、ディスク、カード、半導体メモリ、プログラマブルな論理回路などを用いることができる。また、上記プログラムは、該プログラムを伝送可能な任意の伝送媒体(通信ネットワークや放送波等)を介して上記コンピュータに供給されてもよい。なお、本発明の一態様は、上記プログラムが電子的な伝送によって具現化された、搬送波に埋め込まれたデータ信号の形態でも実現され得る。 In the latter case, the information processing apparatus including the control unit 210 and/or the control unit 110-1 includes a CPU that executes instructions of a program, which is software realizing each function, a ROM (Read Only Memory) or storage device (these are referred to as a "recording medium") in which the program and various data are recorded so as to be readable by the computer (or CPU), a RAM (Random Access Memory) into which the program is expanded, and the like. The object of the present invention is achieved by the computer (or CPU) reading the program from the recording medium and executing it. As the recording medium, a "non-transitory tangible medium" such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used. The program may also be supplied to the computer via any transmission medium (such as a communication network or a broadcast wave) capable of transmitting the program. Note that one aspect of the present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.

The present invention is not limited to the above-described embodiments; various modifications are possible within the scope of the claims, and embodiments obtained by appropriately combining the technical means disclosed in different embodiments are also included in the technical scope of the present invention.

[Appendix 1]
The contents of the present disclosure are listed as follows.

(Item 1) In this embodiment, a game program executed on a computer (100) including a processor (10), a memory (11), and a touch screen (15) has been described. According to an aspect of this embodiment, the game program causes the processor (10) to execute: a step of receiving input operations on the touch screen (15) (step S103); a step of controlling the placement of a virtual camera in the game space (step S102); a step of displaying an image captured by the virtual camera on the touch screen (15) (step S102); and a step (step S108) of, when a first input operation performed by moving the touch position on the touch screen (15) from a first position (L1) to a second position (L2) is received, comparing the direction of the touch operation determined by the first position (L1) and the second position (L2) with the direction in which the game character (C) faces on the screen displayed on the touch screen (15), and causing the game character (C) to perform an action according to the comparison result. In an action game, actions can thus be differentiated according to the direction the game character faces and the direction of the touch operation, which makes the game more engaging.
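As an illustrative, non-limiting sketch of the comparison described in (Item 1) — not part of any claimed embodiment, with all function names and the 45-degree tolerance chosen hypothetically — the decision between an attack-type action and a movement action could be expressed as:

```python
import math

def touch_direction(first_pos, second_pos):
    """Angle (degrees, 0-360) of the drag from the first to the second touch position."""
    dx = second_pos[0] - first_pos[0]
    dy = second_pos[1] - first_pos[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0

def angular_difference(a, b):
    """Smallest absolute difference between two angles, in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def select_action(first_pos, second_pos, facing_deg, tolerance_deg=45.0):
    """Attack if the drag direction falls within the tolerance range around the
    character's facing direction; otherwise move in the drag direction."""
    drag = touch_direction(first_pos, second_pos)
    if angular_difference(drag, facing_deg) <= tolerance_deg:
        return "attack"
    return "move"
```

Here the first input operation is reduced to its two touch positions; the `tolerance_deg` parameter plays the role of the "certain range of directions" referred to in (Item 2).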

(Item 2) In (Item 1), the step of causing the action to be performed includes, when the first input operation is received, causing the game character to perform an action for acting on another object if the direction of the touch operation falls within a certain range of directions determined by the direction in which the game character (C) faces. This makes the operations for causing the game character to perform an attacking action more varied.

(Item 3) In (Item 2), the step of causing the action to be performed includes: when the first input operation has been received, a second input operation (an input made without moving the touch position on the touch screen (15)) has not been received within a certain period before the first input operation, and the direction of the touch operation is not within the certain range of directions, causing the game character (C) to perform an action for moving in the direction of the touch operation without performing an action for acting on another object; and, when the first input operation has been received, the second input operation has been received within the certain period before the first input operation, and the direction of the touch operation is not within the certain range of directions, causing the game character (C) to perform an action for moving in the direction of the touch operation without performing an action for acting on another object, followed by an action for acting on another object. Since the game character can perform a variety of actions, the game becomes more engaging.

(Item 4) In (Item 3), the step of causing the action to be performed includes: when the second input operation is received, causing the game character (C) to perform another action corresponding to the second input operation; and, when the first input operation has been received, the second input operation has been received within the certain period before the first input operation, and the direction of the touch operation is not within the certain range of directions, causing the game character (C) to perform an action for moving in the direction of the touch operation without performing an action for acting on another object, followed by an action for acting on another object that differs from the other action. Since various operations allow the game character to attack and evade, the game becomes more engaging.

(Item 5) In any one of (Item 1) to (Item 4), the game program further causes the processor (10) to execute a step of, when a third input operation performed by moving the touch position on the touch screen (15) from a third position to a fourth position and then holding the touch position at the fourth position is received, causing the game character (C) to perform an action for moving in the direction of the touch operation determined by the third position and the fourth position. For an operation that requires the touch position to be held, having the game character move rather than attack makes it easier for the user to intuitively associate the operation with the character's response. The game can therefore be made more engaging.

(Item 6) In (Item 5), the game program further causes the processor (10) to execute a step of changing the direction in which the game character (C) faces to a direction determined based on the direction of the touch operation corresponding to the third input operation. This makes erroneous operations less likely when the user performs an operation to attack, or to trigger another action, relative to the direction the game character faces.

(Item 7) In (Item 5) or (Item 6), the game program further causes the processor (10) to execute a step of, when the third input operation is received, displaying on the touch screen (15) an object indicating the direction of the touch operation determined by the third position and the fourth position. This makes it easy to move the game character in the desired direction, improving usability.

(Item 8) In any one of (Item 5) to (Item 7), the first input operation is a flick operation and the third input operation is a swipe operation. This allows the game character to perform a wider variety of actions.
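(Item 8) distinguishes a flick (the first input operation, a drag ending in a quick release) from a swipe (the third input operation, a drag whose touch is held at the end position). A hypothetical classifier for a completed drag — thresholds and names are illustrative assumptions, not taken from the disclosure — might be:

```python
import math

def classify_gesture(start, end, duration_s, released,
                     flick_max_s=0.2, min_distance_px=24.0):
    """Classify a drag on the touch screen.

    A drag that covers enough distance and ends with a quick release is
    treated as a flick; a drag whose touch remains held (or that takes
    longer) is treated as a swipe. Very short movements are taps/holds.
    """
    distance = math.hypot(end[0] - start[0], end[1] - start[1])
    if distance < min_distance_px:
        return "tap" if released else "hold"
    if released and duration_s <= flick_max_s:
        return "flick"
    return "swipe"
```

Under (Items 1) and (5), a result of `"flick"` would feed the direction comparison, while `"swipe"` would drive continuous movement toward the held position.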

(Item 9) In (Item 8), the step of causing the action to be performed includes: when a flick operation is received, causing the game character (C) to perform an action for acting on another object if the direction of the touch operation falls within the certain range of directions determined by the direction in which the game character (C) faces; and, when the second input operation (an input made without moving the touch position on the touch screen (15)) is received, causing the game character (C) to perform another action for acting on another object, which differs from the action performed when the flick operation is received. Since various operations allow the game character to perform attacking actions, the game becomes more engaging.

(Item 10) In (Item 9), the step of causing the action to be performed includes, when a flick operation is received, causing the game character (C) to perform an action for moving without performing an action for acting on another object if the direction of the touch operation does not fall within the certain range of directions determined by the direction in which the game character (C) faces. A flick is an operation that the user performs quickly and intuitively in a game that demands fast input, so the user can intuitively choose between attacking and evading. The game therefore becomes more engaging.

(Item 11) In (Item 2), the step of causing the action to be performed includes: when the second input operation (an input made without moving the touch position on the touch screen) is received, causing the game character to perform, in response to the second input operation, another action for acting on another object; and, by receiving the first input operation or the second input operation during at least part of the action execution period from when the game character starts performing an action in response to the first or second input operation until it finishes, causing the game character to perform, immediately after that action finishes, the action corresponding to the input operation received during that period.

(Item 12) In (Item 11), the step of causing the action to be performed includes: when the first input operation is received after the game character has finished performing an action for acting on another object in response to the first or second input operation, and the direction of the touch operation is not within the certain range of directions, causing the game character to perform an action for moving in the direction of the touch operation without performing an action for acting on another object; and, when the first input operation is received during at least part of the action execution period from when the game character starts performing an action for acting on another object in response to the first or second input operation until it finishes, and the direction of the touch operation is not within the certain range of directions, causing the game character to perform an action for moving in the direction of the touch operation without performing an action for acting on another object, followed by an action for acting on another object.

(Item 13) The step of causing the action to be performed includes: by receiving the first input operation or the second input operation during at least part of the action execution period from when an action for acting on another object starts until it finishes, causing the game character to perform actions for acting on another object in succession, and displaying on the touch screen a count that is incremented each time an action affects another object.
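(Items 11) to (13) describe buffering an input received while an action is still executing, performing the corresponding action as soon as the current one finishes, and incrementing a count-up display each time an attack affects another object. A minimal sketch of that bookkeeping — all names are hypothetical, and every finished attack is assumed to land for simplicity — could be:

```python
class ActionRunner:
    """Buffers one input received during an action execution period and chains
    the corresponding action after the current action finishes."""

    def __init__(self):
        self.current = None   # action currently executing
        self.buffered = None  # input received during the execution period
        self.hit_count = 0    # count-up display of (Item 13)

    def receive(self, action):
        if self.current is None:
            self.current = action          # idle: start immediately
        else:
            self.buffered = action         # executing: remember for later

    def finish_current(self):
        """Called when the executing action's animation ends."""
        if self.current == "attack":
            self.hit_count += 1            # count up on each landed attack
        self.current, self.buffered = self.buffered, None
```

Chaining two attacks then produces two count-ups, one at the end of each execution period, matching the successive-attack behavior of (Item 13).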

(Item 14) A computer-readable recording medium on which the game program according to any one of (Item 1) to (Item 13) is recorded.

(Item 15) In this embodiment, a method by which a computer (100) executes the game program has been described. The computer includes a processor, a memory, and a touch screen, and executes the game program. According to an aspect of this embodiment, the method includes the computer (100) executing: a step of receiving input operations on the touch screen (15); a step of controlling the placement of a virtual camera in the game space; a step of displaying an image captured by the virtual camera on the touch screen (15); and a step of, when a first input operation performed by moving the touch position on the touch screen (15) from a first position (L1) to a second position (L2) is received, comparing the direction of the touch operation determined by the first and second positions with the direction in which the game character faces on the screen displayed on the touch screen (15), and causing the game character (C) to perform an action according to the comparison result.

(Item 16) In this embodiment, an information processing apparatus (100) has been described. The information processing apparatus (100) includes a storage unit (120) configured to store the game program, a control unit (110) configured to control the operation of the information processing apparatus (100), and a touch screen (15). According to an aspect of this embodiment, the control unit receives input operations on the touch screen (15), controls the placement of a virtual camera in the game space, displays an image captured by the virtual camera on the touch screen (15), and, when a first input operation performed by moving the touch position on the touch screen (15) from a first position to a second position is received, compares the direction of the touch operation determined by the first and second positions with the direction in which the game character (C) faces on the screen displayed on the touch screen (15), and causes the game character (C) to perform an action according to the comparison result.

[Appendix 2]
The contents according to another aspect of the present invention are listed as follows.

(Item 1) A game program (131) executed by a computer (user terminal 100) including a processor (10), a memory (11), and a touch screen (15) has been described. According to an aspect of the present disclosure, the game based on the game program is a game in which, when an operation on the touch screen is received, an operation character operated by the user performs the action associated with that operation. The game program causes the processor to execute a step of displaying an editing screen that associates each operation included in an operation sequence (a series of consecutive operations on the touch screen) with an action according to the position of that operation in the sequence.

With the above configuration, the user can associate each operation in an operation sequence with an action according to the position of that operation in the sequence. When the user then performs the operations in succession, the operation character performs the associated actions in turn, in the order of the sequence. The user can therefore vary, through trial and error, how the game unfolds in response to consecutive touch inputs. As a result, the user can enjoy diverse game developments according to consecutive operations, which makes the game more entertaining.

(Item 2) In (Item 1), where the operation sequence represents consecutive operations of the same type, the game program may further cause the processor to execute: a first step of, when operations of a first type are received in succession, causing the operation character to perform, according to the order in which the operations of the first type were received, the actions associated on the editing screen with the operations at the corresponding positions in the operation sequence of the first type; and a second step of, when an operation of a second type is received following the operations of the first type, causing the operation character to perform the action associated on the editing screen with the first operation in the operation sequence of the second type. This allows the user to chain multiple types of operation sequences and enjoy diverse game developments.
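The behavior described in (Item 2) — consecutive same-type operations advance through that type's edited sequence, while a change of type restarts from the head of the new type's sequence — can be sketched as follows. The table layout, the action names, and the clamping at the end of a sequence are illustrative assumptions, not specifics of the disclosure:

```python
def resolve_actions(operations, combo_table):
    """Map a run of consecutive operations to actions.

    combo_table maps an operation type (e.g. "tap", "flick") to the ordered
    list of actions associated with that type's operation sequence on the
    editing screen. A same-type operation advances the position; a different
    type restarts from the first action of the new type's sequence.
    """
    actions = []
    prev_type, index = None, 0
    for op in operations:
        index = index + 1 if op == prev_type else 0
        sequence = combo_table[op]
        # Clamp if the run outlasts the edited sequence (an assumption here).
        actions.append(sequence[min(index, len(sequence) - 1)])
        prev_type = op
    return actions
```

For example, with `{"tap": ["jab", "straight", "uppercut"], "flick": ["slash", "spin"]}`, the input tap-tap-flick would yield jab, straight, slash: the flick interrupts the tap run and restarts at the head of the flick sequence.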

(Item 3) In (Item 2), the operation types may include at least one of a tap operation, a flick operation, and a rotation operation. A rotation operation is one in which the locus of the touch position on the touch screen forms a ring or an approximate ring. This allows the user to enjoy diverse game developments in response to consecutive operations using at least one of these operation types.

(Item 4) In (Item 2) or (Item 3), the operation types may include a type based on the direction of the operation, where the direction of the operation is the direction in which the touch position moves on the touch screen. This allows the user to enjoy game developments that vary with the direction of consecutive operations.

(Item 5) In any one of (Item 2) to (Item 4), on the editing screen, the actions that can be associated with each operation in the operation sequence may have characteristics corresponding to the type of the associated operation. This lets the user easily recall, from the operation type, the action associated with each operation when performing consecutive operations, improving operability.

(Item 6) In any one of (Item 1) to (Item 5), when a predetermined condition is satisfied in the game, actions that can be associated with each operation in the operation sequence may be added on the editing screen. This strengthens the user's motivation to play the game in order to increase the actions available for association.

(Item 7) In any one of (Item 1) to (Item 6), the step of displaying the editing screen may display, for each possessed character the user holds as a candidate operation character, an editing screen that associates each operation in the operation sequence with an action according to the position of that operation in the sequence. This allows the user to enjoy game developments that vary with each possessed character in response to consecutive operations.

(Item 8) In (Item 7), on the editing screen for a possessed character, the actions that can be associated with each operation in the operation sequence may include actions that the possessed character has already acquired in the game. This allows the user to have the operation character perform actions that fit that character's worldview.

(Item 9) In (Item 8), on the editing screen for a possessed character, the actions that can be associated with each operation in the operation sequence may include actions already acquired in the game by other possessed characters of the same type as that character. This allows the user to have the operation character perform a variety of actions that fit the worldview of that character type.

(Item 10) In any one of (Item 1) to (Item 9), the number of operations that can be included in an operation sequence may be limited to a maximum number or less. This allows the user to have the operation character perform up to the maximum number of actions in succession with a single operation sequence. Further, when (Item 10) is applied in (Item 2), the user can chain multiple types of operation sequences, each up to the maximum length, to have the operation character perform more consecutive actions than the maximum, which makes the game more entertaining.

(Item 11) In (Item 10), the maximum number may increase when a predetermined condition is satisfied in the game. This strengthens the user's motivation to play the game in order to increase the maximum number.

(Item 12) In (Item 10) or (Item 11), on the editing screen, actions may be assignable to operations up to an editable number that is set to the maximum number or less. This allows the user to have the operation character perform up to the editable number of actions in succession with a single operation sequence. Further, since the user can anticipate that the editable number may be increased up to the maximum, the user's motivation to play the game is strengthened.

(Item 13) In (Item 12), the editable number may increase when a predetermined condition is satisfied in the game. This strengthens the user's motivation to play the game in order to increase the editable number.

(Item 14) In any one of (Item 1) to (Item 13), a cost may be set for each action, and actions may be assignable to operations only within a range in which the total cost of the actions associated with the operations in one or more operation sequences editable on the editing screen does not exceed a predetermined cost upper limit. This prevents the actions that the operation character performs in succession in response to consecutive operations from becoming so advantageous to game progress that the game loses its appeal.
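The cost constraint of (Item 14) can be sketched as a simple validation performed by the editing screen before accepting a new association; the data shapes and names below are hypothetical:

```python
def can_assign(assigned_costs, new_action_cost, cost_limit):
    """Return True if adding an action with cost new_action_cost keeps the
    total cost of all currently associated actions within cost_limit.

    assigned_costs maps each edit slot (operation in a sequence) to the
    cost of the action already associated with it.
    """
    total = sum(assigned_costs.values())
    return total + new_action_cost <= cost_limit
```

Raising `cost_limit` when a predetermined condition is met, as in (Item 15), would then immediately widen the set of permissible associations.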

(Item 15) In (Item 14), the cost upper limit may increase when a predetermined condition is satisfied in the game. This strengthens the user's motivation to play the game in order to increase the cost upper limit.

(Item 16) In any one of (Item 1) to (Item 15), the step of displaying the editing screen may associate actions with all operations in the operation sequence at once, based on information in which another user has associated actions with those operations. This allows the user to enjoy game developments in response to consecutive operations using the actions that another user has associated with the operations in the sequence.

(Item 17) In any one of (Item 1) to (Item 16), the step of displaying the editing screen may associate recommended actions with all operations in the operation sequence at once. This allows the user to enjoy game developments in response to consecutive operations using the actions recommended for the operations in the sequence.

(Item 18) A method for executing the game program has been described. According to an aspect of the present disclosure, the game program (131) is executed by a computer including a processor (10), a memory (11), and a touch screen (15). The game based on the game program is a game in which, when an operation on the touch screen is received, an operation character operated by the user performs the action associated with that operation. In the method, the processor displays an editing screen that associates each operation included in an operation sequence (a series of consecutive operations on the touch screen) with an action according to the position of that operation in the sequence. The method according to (Item 18) has the same effects as the game program according to (Item 1).

(Item 19) An information processing apparatus (user terminal 100) has been described. According to an aspect of the present disclosure, the information processing apparatus includes a storage unit (120) that stores a game program (131), a control unit (110) that controls the operation of the information processing apparatus by executing the game program, and a touch screen (15). The game based on the game program is a game in which, when an operation on the touch screen is received, an operation character operated by the user performs the action associated with that operation. The control unit displays an editing screen that associates each operation included in an operation sequence (a series of consecutive operations on the touch screen) with an action according to the position of that operation in the sequence. The information processing apparatus according to (Item 19) has the same effects as the game program according to (Item 1).

1 game system, 2 network, 10, 20 processor, 11, 21 memory, 12, 22 storage, 13, 23 communication IF, 14, 24 input/output IF, 15 touch screen, 17 camera, 18 distance sensor, 100 user terminal (information processing apparatus, client computer), 110, 210 control unit, 111-1 operation receiving unit, 112-1 display control unit, 113-1 UI control unit, 114-1 animation generation unit, 115 game execution unit, 116 camera arrangement control unit, 117 edit screen generation unit, 118 continuous operation determination unit, 120 storage unit, 131 game program, 132 game information, 133 user information, 151 input unit, 152 display unit, 200 server (information processing apparatus, server computer), 1010 object, 1020 controller, 1030 storage medium

Claims (19)

 A game program,
 the game program being executed by a computer including a processor, a memory, and a touch screen,
 the game based on the game program being a game that, upon receiving an operation on the touch screen, causes an operation character operated by a user to execute the action associated with the operation,
 the game program causing the processor to execute:
  a step of displaying an editing screen that associates each operation included in an operation sequence of continuous operations on the touch screen with an action according to the order of the operation in the operation sequence.
 The game program according to claim 1, further causing the processor to execute, when the operation sequence represents continuous operations of the same type:
  a first step of, when operations of a first type are received in succession, causing the operation character to execute, in the order in which the operations of the first type were received, the actions associated on the editing screen with the operations in the corresponding order in the operation sequence of the first type; and
  a second step of, when an operation of a second type is received following the operations of the first type, causing the operation character to execute the action associated on the editing screen with the first operation in the operation sequence of the second type.
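The claim 2 behavior can be sketched as a small resolver: consecutive operations of the same type advance through that type's edited sequence, and a type change restarts from the head of the new type's sequence. All names and the sample action table are hypothetical assumptions for illustration.

```python
class ComboResolver:
    """Resolve which edited action fires for each incoming operation.

    Claim 2 behavior: same-type operations in succession step through that
    type's edited sequence in order; an operation of a different type starts
    from the head (first slot) of the new type's sequence.
    """

    def __init__(self, edited):
        # edited: dict mapping (op_type, index) -> action name
        self._edited = edited
        self._current_type = None
        self._index = 0

    def resolve(self, op_type):
        if op_type == self._current_type:
            self._index += 1          # same type: advance to the next slot
        else:
            self._current_type = op_type
            self._index = 0           # type changed: head of the new sequence
        return self._edited.get((op_type, self._index))
```

With `{"tap": ["jab", "straight"], "flick": ["slash"]}` flattened into the table, the stream tap, tap, flick, tap would yield jab, straight, slash, jab.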
 The game program according to claim 2, wherein the types of operation include at least one of a tap operation, a flick operation, and a rotation operation, the rotation operation being an operation in which the locus of touch positions on the touch screen forms a ring or a near-ring shape.
 The game program according to claim 2 or 3, wherein the types of operation include a type based on the direction of the operation, the direction of the operation being the movement direction of the touch position on the touch screen.
 The game program according to any one of claims 2 to 4, wherein, on the editing screen, an action that can be associated with an operation included in the operation sequence has characteristics corresponding to the type of the operation with which it is associated.
 The game program according to any one of claims 1 to 5, wherein, when a predetermined condition is satisfied in the game, an action that can be associated with each operation included in the operation sequence is added on the editing screen.
 The game program according to any one of claims 1 to 6, wherein the step of displaying the editing screen displays, for each possessed character that the user holds as a character capable of becoming the operation character, the editing screen that associates each operation included in the operation sequence with an action according to the order of the operation in the operation sequence.
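A crude way to distinguish the operation types named in claims 3 and 4 is to examine the touch locus: little movement is a tap, a long locus that returns near its start is ring-like (rotation), and anything else is a directional flick. The thresholds and the closure test below are assumptions for illustration; the claims only require that a rotation's locus be ring-shaped or nearly so.

```python
import math

def classify_gesture(points, tap_radius=10.0, ring_closure=15.0):
    """Heuristic classifier for tap / flick / rotation touch loci.

    points: list of (x, y) touch positions sampled during one operation.
    """
    x0, y0 = points[0]
    xn, yn = points[-1]
    end_dist = math.hypot(xn - x0, yn - y0)   # start-to-end displacement
    path_len = sum(                            # total traced path length
        math.hypot(x2 - x1, y2 - y1)
        for (x1, y1), (x2, y2) in zip(points, points[1:])
    )
    if path_len < tap_radius:
        return "tap"                # barely moved: a tap
    if end_dist < ring_closure and path_len > 4 * ring_closure:
        return "rotate"             # long locus returning near its start: ring-like
    # Otherwise a flick; claim 4 keys the type on the movement direction.
    dx, dy = xn - x0, yn - y0
    direction = "horizontal" if abs(dx) >= abs(dy) else "vertical"
    return f"flick-{direction}"
```

In practice a real gesture recognizer would also use timing and curvature, but this shows how the three claimed operation types can be separated from raw touch positions.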
 The game program according to claim 7, wherein, on the editing screen for a possessed character, the actions that can be associated with each operation included in the operation sequence include actions that the possessed character has already acquired in the game.
 The game program according to claim 8, wherein, on the editing screen for a possessed character, the actions that can be associated with each operation included in the operation sequence include actions already acquired in the game by another possessed character of the same kind as the possessed character.
 The game program according to any one of claims 1 to 9, wherein the number of operations that can be included in the operation sequence is no greater than a predetermined maximum number.
 The game program according to claim 10, wherein the maximum number increases when a predetermined condition is satisfied in the game.
 The game program according to claim 10 or 11, wherein, on the editing screen, actions can be associated with operations up to an editable number that is set to the maximum number or less.
 The game program according to claim 12, wherein the editable number increases when a predetermined condition is satisfied in the game.
 The game program according to any one of claims 1 to 13, wherein a cost is set for each action, and an action can be associated with an operation within a range in which the sum of the costs set for the actions associated with the operations included in the one or more operation sequences editable on the editing screen does not exceed a predetermined cost upper limit.
 The game program according to claim 14, wherein the cost upper limit increases when a predetermined condition is satisfied in the game.
 The game program according to any one of claims 1 to 15, wherein the step of displaying the editing screen collectively associates the actions with the operations included in the operation sequence on the basis of information in which another user has associated actions with the operations included in the operation sequence.
 The game program according to any one of claims 1 to 16, wherein the step of displaying the editing screen collectively associates recommended actions with the operations included in the operation sequence.
 A method for executing a game program,
 the game program being executed by a computer including a processor, a memory, and a touch screen,
 the game based on the game program being a game that, upon receiving an operation on the touch screen, causes an operation character operated by a user to execute the action associated with the operation,
 the method comprising executing, by the processor:
  a step of displaying an editing screen that associates each operation included in an operation sequence of continuous operations on the touch screen with an action according to the order of the operation in the operation sequence.
 An information processing apparatus, comprising:
  a storage unit that stores a game program;
  a control unit that controls the operation of the information processing apparatus by executing the game program; and
  a touch screen,
 wherein the game based on the game program is a game that, upon receiving an operation on the touch screen, causes an operation character operated by a user to execute the action associated with the operation, and
 wherein the control unit displays an editing screen that associates each operation included in an operation sequence of continuous operations on the touch screen with an action according to the order of the operation in the operation sequence.
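Claims 10 and 14 constrain the editing screen with two budgets: a cap on the number of operations in a sequence and a cap on the total cost of the associated actions. A minimal admission check, with placeholder parameter names since the disclosure gives no concrete values, might look like this:

```python
def can_assign(assigned_costs, new_cost, cost_limit, max_slots):
    """Check whether one more action may be associated on the editing screen.

    assigned_costs: costs of actions already associated across the editable
    operation sequences. cost_limit and max_slots are game parameters that
    claims 11 and 15 allow to grow when in-game conditions are met.
    """
    if len(assigned_costs) >= max_slots:                 # claim 10: slot cap
        return False
    return sum(assigned_costs) + new_cost <= cost_limit  # claim 14: cost cap
```

An editing screen would call this before accepting each new association, rejecting edits that would overflow either budget.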
PCT/JP2017/031526 2016-09-01 2017-09-01 Game program, method, and information processing device Ceased WO2018043693A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2016-171246 2016-09-01
JP2016171246A JP6190505B1 (en) 2016-09-01 2016-09-01 GAME PROGRAM, RECORDING MEDIUM, METHOD, AND INFORMATION PROCESSING DEVICE
JP2017-158755 2017-08-21
JP2017158755A JP2019034075A (en) 2017-08-21 2017-08-21 Game program, method and information processor

Publications (1)

Publication Number Publication Date
WO2018043693A1 true WO2018043693A1 (en) 2018-03-08

Family

ID=61309391

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/031526 Ceased WO2018043693A1 (en) 2016-09-01 2017-09-01 Game program, method, and information processing device

Country Status (1)

Country Link
WO (1) WO2018043693A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020044134A (en) * 2018-09-19 2020-03-26 株式会社コロプラ Game program, method, and information processing device
JP2020044152A (en) * 2018-09-20 2020-03-26 株式会社カプコン Game program and game system
CN115138069A (en) * 2022-08-12 2022-10-04 竞技世界(北京)网络技术有限公司 Data processing method and device

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6180610B1 (en) * 2016-11-01 2017-08-16 株式会社コロプラ GAME METHOD AND GAME PROGRAM

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6180610B1 (en) * 2016-11-01 2017-08-16 株式会社コロプラ GAME METHOD AND GAME PROGRAM

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
"Shinjigen Game Neptune VII", YOYAKU TOKUTEN & GENTEIBAN JOHO, KYARA YA SYSTEM, SHUDAIKA JOHO MO KOKAI, 8 January 2015 (2015-01-08), Retrieved from the Internet <URL:http://gamestalk.net/neptune-vii-8> [retrieved on 20171106] *
AZUREN: "Combo no Tsukaikata o Kaisetsu! Hitsuyo na Kunsho to Tokushu Koka Ichiran", COMBO LEVEL NO AGEKATA, 11 October 2015 (2015-10-11), Retrieved from the Internet <URL:http://gameleaks.jp/toram-online/2423> [retrieved on 20171106] *
QUEENS GATE SPIRAL CHAOS, SHUKAN FAMI TSU, vol. 26, no. 35, 4 August 2011 (2011-08-04) *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020044134A (en) * 2018-09-19 2020-03-26 株式会社コロプラ Game program, method, and information processing device
JP2020044152A (en) * 2018-09-20 2020-03-26 株式会社カプコン Game program and game system
CN115138069A (en) * 2022-08-12 2022-10-04 竞技世界(北京)网络技术有限公司 Data processing method and device

Similar Documents

Publication Publication Date Title
JP6370417B2 (en) Game program, method for executing game program, and information processing apparatus
JP6472555B1 (en) GAME PROGRAM, METHOD, AND INFORMATION PROCESSING DEVICE
WO2016023971A1 (en) Selecting objects on a user interface
JP7049968B2 (en) Game programs, methods, and information processing equipment
JP6514376B1 (en) Game program, method, and information processing apparatus
JP2020022574A (en) Game program, method, and information processing device
WO2018043693A1 (en) Game program, method, and information processing device
JP2020044154A (en) Game program, method, and information processing device
JP2019195418A (en) Game program, method, and information processing device
JP7346055B2 (en) game program
JP6416419B1 (en) GAME PROGRAM, METHOD, AND INFORMATION PROCESSING DEVICE
JP6661595B2 (en) Game program, method and information processing device
US20220355189A1 (en) Game program, game method, and information processing device
JP7161977B2 (en) Program, method, and information processor
JP7672474B2 (en) program
JP6668425B2 (en) Game program, method, and information processing device
JP2019034075A (en) Game program, method and information processor
JP2020000735A (en) Program, method, and information processing device
KR20240158302A (en) Method and device for controlling virtual objects, and devices and media
JP2020043926A (en) Game program, method, and information processing device
JP7272799B2 (en) Game program, method, and information processing device
JP7252915B2 (en) Game program, method, and information processing device
JP2019195419A (en) Game program, method, and information processing device
JP7368553B2 (en) Game program, method, and information processing device
JP2019034216A (en) Game program, method and information processor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17846696

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17846696

Country of ref document: EP

Kind code of ref document: A1