Detailed Description
The technical solutions of the embodiments of the present application will be clearly described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which are obtained by a person skilled in the art based on the embodiments of the present application, fall within the scope of protection of the present application.
The terms first, second and the like in the description and in the claims, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances such that the embodiments of the application are capable of operation in sequences other than those illustrated or otherwise described herein, and that the objects identified by "first," "second," etc. are generally of a type not limited to the number of objects, for example, the first object may be one or more. Furthermore, in the description and claims, "and/or" means at least one of the connected objects, and the character "/", generally means that the associated object is an "or" relationship.
The terms "at least one," "at least one of," and the like in the description and in the claims mean any one, any two, or a combination of two or more of the listed objects. For example, at least one of a, b, and c may represent "a", "b", "c", "a and b", "a and c", "b and c", or "a, b and c", where a, b, and c may each be singular or plural. Similarly, the term "at least two" means two or more, and its meaning is similar to that of "at least one".
The object sharing method, device, electronic device, and storage medium provided by the embodiments of the present application are described in detail below through specific embodiments and their application scenarios with reference to the accompanying drawings.
The object sharing method provided by the embodiments of the application can be applied to object sharing scenarios, such as picture sharing, document sharing, or video sharing.
Taking picture sharing as an example, a picture sharing scenario in the prior art is specifically described as scenario 1.
Scenario 1: In the case where the electronic device displays a chat interface with the contact Xiaohong, if the user needs to send Xiaohong a picture including an elephant and a hot air balloon, and the picture is not found in the picture search interface of the chat application on the electronic device, the user needs to find the picture in the album application and then share it in the chat interface of the chat application. That is, when the picture is not in the picture search interface of the chat application, the electronic device needs to exit the chat application and open the album application, so that the user can search for the required picture in the album application and then share it, which makes the picture sharing process of the electronic device complicated and time-consuming.
Illustratively, a document-to-picture sharing scenario in the prior art is specifically described as scenario 2.
Scenario 2: In the case where the electronic device displays a chat interface with the contact Xiaohong, if the user needs to send Xiaohong a picture corresponding to a document about artificial intelligence (AI), the electronic device needs to exit the chat application, open the document application, and display the document interface, so that the user can search the document interface for the required document about artificial intelligence, trigger the electronic device to open the document, then control the electronic device to capture the document content of the document to obtain a document screenshot, and store the document screenshot in the album application. The user may then control the electronic device to display an image selection interface of the chat application that includes the document screenshot, so that the electronic device can share the document screenshot with Xiaohong based on the user's input on the document screenshot. However, after the electronic device finds the document required by the user, the user still needs to manually capture and share the document, which makes the picture sharing process of the electronic device complicated and time-consuming.
In the object sharing method provided by the embodiments of the application, since one selection image is associated with one application program, the electronic device can directly invoke the application program corresponding to a selection image displayed in the image selection interface. The electronic device can thus directly import the object selected by the user in that application program into the first application and share it directly. In other words, when the image selection interface of the first application does not include the object the user wants to share, there is no need to first store the object from the other application program in the album application and then import it into the image selection interface through the album application before sharing it. This simplifies the step of searching for the object and further improves the efficiency of sharing the object.
The execution subject of the object sharing method provided by the embodiment of the application can be an object sharing device, and the object sharing device can be an electronic device or a functional module in the electronic device. The technical solution provided by the embodiment of the present application is described below by taking an electronic device as an example.
The embodiment of the application provides an object sharing method, and fig. 1 shows a flowchart of the object sharing method provided by the embodiment of the application. As shown in fig. 1, the method for sharing objects provided in the embodiment of the present application may include the following steps 201 to 204.
Step 201, the electronic device receives a first input to a first control in a first application.
In the embodiment of the application, the first control is used for searching the image.
In the embodiment of the application, the first input is used to trigger the electronic device to display the image selection interface corresponding to the first control.
Optionally, in the embodiment of the present application, the electronic device may display the image selection interface by means of gesture zooming, dragging, or an atom island.
Optionally, in the embodiment of the present application, the first input may be a click input performed by the user on the first control with a touch device such as a finger or a stylus, a voice command input by the user, a specific gesture input by the user, or another feasible input, which may be determined according to actual use requirements and is not limited in the embodiment of the present application.
Optionally, in an embodiment of the present application, the specific gesture may be any one of a single-click gesture, a swipe gesture, a drag gesture, a pressure recognition gesture, a long-press gesture, an area change gesture, a double-press gesture, and a double-click gesture.
Optionally, in the embodiment of the present application, the click input may be a single click input, a double click input, or any number of click inputs, and may also be a long press input or a short press input.
Illustratively, the first input may be a single click input of the first control by a user.
Optionally, in the embodiment of the present application, the first application may be any application program with a communication function in the electronic device, for example, an instant chat application program, a short message application program, a takeaway application program, a shopping application program, and the like.
Optionally, in the embodiment of the present application, the electronic device may display the first control in a session function area in a session interface of the first application.
The session interface may be a chat interface between the user and another user in the case where the first application is an instant chat application or an SMS application, a chat interface between the user and a takeaway rider or a merchant in the case where the first application is a takeaway application, or a chat interface between the user and a merchant in the case where the first application is a shopping application.
Optionally, in an embodiment of the present application, the display form of the first control may include at least one of image display, text display, highlighting, color display, and the like.
By way of example, in combination with scenario 1, as shown in (A) of fig. 2, taking the electronic device being a mobile phone as an example, in the case where the mobile phone displays a chat interface 10 with Xiaohong, the chat interface 10 includes a session function identifier 11, a voice input control 12, and a text input control 13. The user can perform a click input on the session function identifier 11, as shown in (B) of fig. 2, so that the electronic device can display a session function area 14, where the session function area 14 includes an album control 15, that is, the first control, a shooting control 16, a business card control 17, and a position control 18.
Step 202, the electronic device displays, in response to the first input, at least one selection image on an image selection interface corresponding to the first application.
In the embodiment of the application, one selection image is associated with one application program.
It will be appreciated that the application program associated with each selection image is different.
It should be noted that a selection image in the present application refers to a control that occupies an area of the image selection interface.
Optionally, in the embodiment of the present application, the number of selection images may be user-defined or preset by the electronic device, which may be determined according to actual use requirements and is not limited in the embodiment of the present application.
Optionally, in the embodiment of the present application, the image style of a selection image may be user-defined or preset by the electronic device, which may likewise be determined according to actual use requirements and is not limited in the embodiment of the present application.
Optionally, in an embodiment of the present application, the at least one selection image may be preferentially displayed in the image selection interface.
Illustratively, the preferential display described above means that the at least one selection image is always arranged at the front of all images in the image selection interface.
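The "preferential display" described above amounts to ordering the interface contents so that selection images always precede ordinary images. A minimal sketch of one way this could be done, where the item structure and field names are illustrative assumptions rather than part of the embodiment:

```python
def order_images(items):
    """Place all selection images before ordinary images, preserving the
    relative order within each group (Python's sorted() is stable)."""
    return sorted(items, key=lambda item: 0 if item["is_selection"] else 1)

# Hypothetical interface contents: two ordinary pictures and two selection images.
images = [
    {"id": "photo1", "is_selection": False},
    {"id": "sel21", "is_selection": True},
    {"id": "photo2", "is_selection": False},
    {"id": "sel22", "is_selection": True},
]
ordered = order_images(images)
# Selection images sel21 and sel22 come first; the ordinary images follow.
```

A stable sort keeps the album's own ordering intact within each group, which matches the description that selection images are simply kept at the front.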
Optionally, in an embodiment of the present application, the image selection interface may include at least one image, where the at least one image may be an image stored in an album application of the electronic device.
Optionally, in an embodiment of the present application, the image selection interface may further include a local folder in the electronic device.
Optionally, in an embodiment of the present application, the electronic device may display at least one selection image on the image selection interface through a popup window. The method can be specifically determined according to actual use requirements, and the embodiment of the application is not limited.
Optionally, in the embodiment of the present application, the electronic device may display a prompt message to prompt the user to search for the required image by selecting the image.
The electronic device may display the prompt information through a pop-up window or in a blank area of the image selection interface, for example.
The blank area refers to an area of the image selection interface that does not include the control and the image.
For example, the prompt information may be "When the album does not contain the required picture, you may click a selection image to search".
Optionally, in the embodiment of the present application, after the electronic device displays the prompt information, the at least one selection image may be marked in a first manner.
Illustratively, the first manner described above may include at least one of highlighting, color marking, dashed-line marking, shaking marking, and the like, which may be determined according to actual use requirements and is not limited in the embodiment of the present application.
Illustratively, in combination with scenario 1 and (B) in fig. 2, as shown in (A) in fig. 3, the user may make a click input on the album control 15, as shown in (B) in fig. 3, so that the mobile phone may display an image selection interface 20, where the image selection interface 20 includes three selection images, namely a selection image 21, a selection image 22, and a selection image 23, and five normal images.
Step 203, the electronic device receives a second input.
In an embodiment of the present application, the second input is used for selecting a target selection image from at least one selection image.
Alternatively, in an embodiment of the present application, the second input may include a plurality of sub-inputs. This is specifically described in the following embodiments and is not repeated here.
Step 204, the electronic device shares, in response to the second input, object content of at least one object in a second application associated with the target selection image.
In the embodiment of the application, the object includes at least one of a document, a link, a picture, a video, and audio.
It will be appreciated that since the selection image is essentially a control, the application associated with the selection image may be run through a second input to the selection image.
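Since each selection image is essentially a control bound to one application, an input on it can be dispatched to launch the associated application. A hedged sketch of such a control-to-application registry, where all names (`registry`, `on_second_input`, the control id 22) are illustrative assumptions:

```python
# Hypothetical dispatch table: selection-image control id -> application launcher.
registry = {}

def register(control_id, launcher):
    """Bind one selection-image control to one application launcher."""
    registry[control_id] = launcher

def on_second_input(control_id):
    """An input on a selection image runs the application registered for it."""
    launcher = registry.get(control_id)
    if launcher is None:
        raise KeyError(f"no application bound to control {control_id}")
    return launcher()

# Selection image 22 is bound to a (simulated) shopping application.
register(22, lambda: "shopping application launched")
outcome = on_second_input(22)
```

The one-to-one binding is what lets the first application open the second application directly, without routing through the album.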
Optionally, in the embodiment of the present application, the electronic device may share the object content of at least one object in the second application through a picture.
It should be noted that the specific implementation process of the step 204 may be described in the following embodiments, and in order to avoid repetition, the description is omitted here.
In the object sharing method provided by the embodiment of the application, the electronic device receives a first input on a first control in a first application; displays, in response to the first input, at least one selection image on an image selection interface corresponding to the first application, where one selection image is associated with one application program; receives a second input for selecting a target selection image from the at least one selection image; and finally shares, in response to the second input, object content of at least one object in a second application associated with the target selection image, where the object includes at least one of a document, a link, a picture, a video, and audio. In this scheme, since one selection image is associated with one application program, the electronic device can directly invoke the application program corresponding to a selection image displayed in the image selection interface, so that the object selected by the user in that application program can be directly imported into the first application and shared. That is, when the image selection interface of the first application does not include the object the user wants to share, there is no need to first store the object from another application program in the album application and then import it into the image selection interface through the album application before sharing it. This simplifies the step of searching for the object and further improves the object sharing efficiency of the electronic device.
Optionally, in an embodiment of the present application, the second input includes a first sub-input and a second sub-input.
In an embodiment of the present application, the first sub-input is used for determining a target selection image from at least one selection image.
Optionally, in the embodiment of the present application, the first sub-input may be a click input performed by the user on the target selection image with a touch device such as a finger or a stylus, a voice command input by the user, a specific gesture input by the user, or another feasible input, which may be determined according to actual use requirements and is not limited in the embodiment of the present application.
The first sub-input may be, for example, a single click input of the user on the target selection image.
In an embodiment of the present application, the second sub-input is used to determine the first object identifier from at least one object identifier.
Optionally, in the embodiment of the present application, the second sub-input may be a click input performed by the user on the first object identifier with a touch device such as a finger or a stylus, a voice command input by the user, a specific gesture input by the user, or another feasible input, which may be determined according to actual use requirements and is not limited in the embodiment of the present application.
The second sub-input may be, for example, a single click input of the first object identification by the user.
Illustratively, the step 204 may be implemented by the following steps 204a and 204b.
Step 204a, the electronic device displays a file management interface of a second application associated with the target selection image in response to the first sub-input of the target selection image.
In the embodiment of the present application, the file management interface includes at least one object identifier, where one object identifier corresponds to one object.
Alternatively, in the embodiment of the present application, the second application may be an application different from the first application.
The second application may be any of an album application, a browser application, and a shopping application, for example.
Optionally, in the embodiment of the present application, the electronic device may display the file management interface of the second application on the image selection interface of the first application in a second manner.
Alternatively, in the embodiment of the present application, the second manner may be any one of popup, page overlay, or atomic component. Specifically, the method can be determined according to actual use requirements, and the embodiment of the application is not limited.
Optionally, in the embodiment of the present application, in the case where the second application includes a picture selector, the file management interface may be an interface corresponding to the picture selector in the second application; in the case where the second application does not include a picture selector, the file management interface may be an application interface of the second application.
It should be noted that the picture selector mentioned above refers to an interface control or functional module that allows a user to select one or more pictures from the electronic device or a cloud server. It is a common interaction component in applications and operating systems.
For example, in the case where the second application is an album application, the file management interface may be an interface for displaying all images in the album application. In the case where the second application is a browser application, the file management interface may be any application interface in the browser application.
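The choice of file management interface described above is a simple branch on whether the second application provides a picture selector. A minimal sketch under assumed names (`has_picture_selector` and the returned strings are illustrative, not part of the embodiment):

```python
def file_management_interface(app):
    """Return which interface to show for the second application: the
    picture-selector interface if the app has one, otherwise the app's
    own application interface."""
    if app.get("has_picture_selector"):
        return f"{app['name']}: picture selector interface"
    return f"{app['name']}: application interface"

# Hypothetical second applications matching the examples above.
album = {"name": "album", "has_picture_selector": True}
browser = {"name": "browser", "has_picture_selector": False}
```

For the album application this yields the selector interface (all images), while the browser application falls back to its ordinary application interface, as in the figures.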
Optionally, in the embodiment of the application, the object identifier may be any one of an image identifier, a text identifier, an expression identifier, a special symbol identifier, or the like. Specifically, the method can be determined according to actual use requirements, and the embodiment of the application is not limited.
Preferably, the object identifier may be an image identifier.
Alternatively, in the case where the second application is an album application, the electronic device may directly display the file management interface of the album application.
Illustratively, in conjunction with (B) of fig. 3, as shown in (A) of fig. 4, the user may click the selection image 21, where the second application associated with the selection image 21 is the album application; as shown in (B) of fig. 4, the mobile phone may then display the file management interface 24 of the album application in the image selection interface 20 through a popup window, where the file management interface 24 includes a search control 25, a search input box 26, recently used pictures, a system album, an open control 27 corresponding to the system album, a shopping application album, and an open control 28 corresponding to the shopping application album.
Optionally, in the embodiment of the application, the electronic device can search for the required image in the album application through the search control in the album application, search for the required image in the system album through the open control corresponding to the system album, or search for the required image in the shopping application album through the open control corresponding to the shopping application album.
Illustratively, in connection with (B) of fig. 4, as shown in (a) of fig. 5, the user may make a click input to the open control 28 corresponding to the shopping application album, as shown in (B) of fig. 5, so that the mobile phone may display the image interface 29 corresponding to the shopping application album through a pop-up window in the image selection interface 20.
Optionally, in the embodiment of the present application, in the case where the second application is a shopping application, the electronic device may directly display a file management interface corresponding to the shopping application.
Illustratively, in connection with (B) of fig. 3, as shown in fig. 6, the user may make a click input on the selection image 22, where the selection image 22 is associated with the shopping application; as shown in (B) of fig. 5, the mobile phone may then display the image interface 29 corresponding to the shopping application album through a popup window in the image selection interface 20.
Optionally, in the embodiment of the present application, in the case where the second application is a browser application, the electronic device may directly display an application interface corresponding to the browser application.
Illustratively, in connection with (B) of fig. 3, as shown in (A) of fig. 7, the user may make a click input on the selection image 23, where the selection image 23 is associated with the browser application; as shown in (B) of fig. 7, the mobile phone may then display a search interface 30 of the browser application in the image selection interface 20 through a popup window, where the search interface 30 includes an image 31 and the comment information "Swift horses are common, but a Bo Le is not".
It should be noted that the applications associated with the selection images described above are merely exemplary illustrations, and the applications associated with the selection images in the embodiments of the present application include, but are not limited to, the above examples.
Step 204b, the electronic device shares, in response to the second sub-input on the first object identifier in the at least one object identifier, the object content of the first object corresponding to the first object identifier.
In the embodiment of the application, in the case where the first object corresponding to the first object identifier is an image, the electronic device can import the first object into the first application according to the second sub-input on the first object identifier in the at least one object identifier, and directly share the object content of the first object with other contacts through the session interface.
Alternatively, in an embodiment of the present application, the first object may be one or more objects.
Optionally, in the embodiment of the application, the electronic device may share a picture including the object content of the first object, or the electronic device may directly share the object content of the first object.
For example, the object content may be the image itself when the first object is an image; a thumbnail of the video corresponding to a video link when the first object is a video link; a picture including the audio content when the first object is audio; a brief description of the video plot when the first object is a video; and a picture including the document content when the first object is a document.
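The correspondence above between object type and shared content can be sketched as a simple dispatch. The type names and string payloads below are illustrative assumptions, not part of the embodiment:

```python
def object_content(obj_type, payload):
    """Return the content to share for a given object type, following the
    type-to-content correspondence described above."""
    if obj_type == "image":
        return payload                                   # share the image itself
    if obj_type == "video_link":
        return f"thumbnail of {payload}"                 # thumbnail of linked video
    if obj_type == "audio":
        return f"picture of audio content: {payload}"
    if obj_type == "video":
        return f"brief description of video: {payload}"
    if obj_type == "document":
        return f"picture of document content: {payload}"
    raise ValueError(f"unsupported object type: {obj_type}")
```

Keeping the mapping in one place means adding a new shareable object type only requires one more branch, without touching the import or sharing steps.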
Optionally, in the embodiment of the present application, a third control may be included in the image selection interface, and after the user selects the first object identifier, the user may input the third control to share the object content of the first object corresponding to the first object identifier.
Illustratively, in conjunction with (B) of fig. 4, as shown in (A) of fig. 8, the user may make a click input on a picture 32 among the recently used pictures in the file management interface 24 of the album application; then, as shown in (B) of fig. 8, the mobile phone may display a "complete" control 33 in the image selection interface 20 and mark the picture 32 by shaking it. The user may make a click input on the "complete" control 33, that is, the third control described above; as shown in (C) of fig. 8, the mobile phone may display the picture 32 in the conversation area 34 of the chat interface 10, thereby sharing the picture 32 with the contact Xiaohong.
Optionally, in an embodiment of the present application, an object candidate area may be included in the image selection interface, where the object candidate area includes a second control and is used to temporarily store the first object.
Illustratively, in conjunction with (B) in fig. 7, as shown in (A) in fig. 9, after the mobile phone displays the search interface 30 of the browser application, an object candidate area 35 and the "complete" control 33 may be displayed in the image selection interface 20, and the user may drag the image 31 in the search interface 30 of the browser application to the object candidate area 35; as shown in (B) in fig. 9, the mobile phone may then add a thumbnail of the image 31 to the object candidate area 35. The user may then make a click input on the "complete" control 33, as shown in (C) of fig. 9, so that the mobile phone displays the image 31 in the conversation area 34 of the chat interface 10, thereby sharing the image 31 with the contact Xiaohong.
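The object candidate area behaves like a temporary staging list: dragged-in objects are stashed, and the "complete" control shares everything staged at once. A minimal sketch under assumed names (`CandidateArea`, `drag_in`, `complete` are illustrative):

```python
class CandidateArea:
    """Temporarily stores objects dragged in before sharing."""
    def __init__(self):
        self.staged = []

    def drag_in(self, obj):
        """User drags an object into the candidate area: stash a reference."""
        self.staged.append(obj)

    def complete(self):
        """User taps the 'complete' control: return everything staged and
        clear the area for the next share."""
        shared, self.staged = self.staged, []
        return shared

area = CandidateArea()
area.drag_in("image 31")
shared = area.complete()
```

Clearing the stash on completion means the candidate area is ready for the next selection round, matching its role as temporary storage.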
According to the embodiment of the application, the electronic device can display at least one object identifier of the second application in the image selection interface according to the user's first sub-input on the target selection image, and the user can perform the second sub-input on the first object identifier among the at least one object identifier, so that the electronic device can obtain the object content of the first object in the second application and share it directly without switching applications, which improves the object sharing efficiency and flexibility of the electronic device.
Optionally, in the embodiment of the present application, the first object corresponding to the first object identifier is a document, and the second input includes a third sub-input, a fourth sub-input and a fifth sub-input.
In the embodiment of the present application, the third sub-input is used to trigger the electronic device to display the file management interface of the second application associated with the target selection image.
Optionally, in the embodiment of the present application, the third sub-input may be a click input performed by the user on the target selection image with a touch device such as a finger or a stylus, a voice command input by the user, a specific gesture input by the user, or another feasible input, which may be determined according to actual use requirements and is not limited in the embodiment of the present application.
The third sub-input may be, for example, a single click input of the user on the target selection image.
In the embodiment of the application, the fourth sub-input may be used to trigger the electronic device to display a document preview interface of the document.
Optionally, in the embodiment of the present application, the fourth sub-input may be a click input performed by the user on the first object identifier with a touch device such as a finger or a stylus, a voice command input by the user, a specific gesture input by the user, or another feasible input, which may be determined according to actual use requirements and is not limited in the embodiment of the present application.
The fourth sub-input may be, for example, a single click input of the user on the first object identifier.
In the embodiment of the application, the fifth sub-input is used for triggering the electronic equipment to acquire the document screenshot of the document.
Optionally, in the embodiment of the present application, the fifth sub-input may be a click input performed by the user on the content sharing control with a touch device such as a finger or a stylus, a voice command input by the user, a specific gesture input by the user, or another feasible input, which may be determined according to actual use requirements and is not limited in the embodiment of the present application.
Illustratively, the fifth sub-input may be a single click input of the content sharing control by the user.
The above step 204 may be implemented by the following steps 301 to 303.
Step 301, the electronic device displays a file management interface of a second application associated with the target selection image in response to the third sub-input of the target selection image.
In the embodiment of the present application, the file management interface includes at least one object identifier, where one object identifier corresponds to one object.
It should be noted that, the specific process of the step 301 may be described in detail in the above embodiment, and the description thereof is omitted here for avoiding repetition.
Step 302, the electronic device responds to a fourth sub-input of the first object identifier in the at least one object identifier, and displays a document preview interface of the first object corresponding to the first object identifier.
In the embodiment of the application, the document preview interface comprises a content sharing control.
Optionally, in the embodiment of the present application, the electronic device may display, on the image selection interface, a document preview interface of the first object corresponding to the first object identifier in the second manner.
Optionally, in an embodiment of the present application, the document preview interface may include document content of a document, where the document content is displayed through at least one document page.
Illustratively, in connection with scenario 2 and (A) of fig. 4, as shown in (A) of fig. 10, the user may make a click input on the selection image 21, so that the mobile phone may display a file management interface 35 of the system album in the image selection interface 20 through a popup window, where the file management interface 35 includes a "select document" control 36, a "select video" control 37, and a "select file" control 38. As shown in (B) of fig. 10, the user can make a click input on the "select document" control 36; as shown in (C) of fig. 10, the mobile phone can then display a document management interface 39, where the document management interface 39 includes a document "AI instruction manual" 40 and a document "activity plan" 41. The user can make a click input on the AI instruction manual 40, as shown in (D) of fig. 10, so that the mobile phone can display a document preview interface 42 of the AI instruction manual 40, where the document preview interface 42 includes the document content of the AI instruction manual 40 and a content sharing control 43.
Optionally, in an embodiment of the present application, the document preview interface may support zooming and scrolling.
Optionally, in an embodiment of the present application, a page selection control may be included in each of the at least one document page, where the page selection control is used to select one document page from the at least one document page.
Optionally, before step 303, the method for sharing objects provided in the embodiment of the present application further includes the following step 304.
Step 304, the electronic device receives a fifth sub-input to the content sharing control.
Step 303, the electronic device performs screenshot on the document content of the document in response to the fifth sub-input of the content sharing control, so as to obtain at least one screenshot, and shares the at least one screenshot.
In the embodiment of the application, the electronic device may invoke the screenshot control according to the fifth sub-input, and capture the document content of the document according to the function corresponding to the content sharing control, so as to obtain at least one screenshot and directly share the at least one screenshot.
In the embodiment of the application, after at least one screenshot is obtained, the electronic device can sequentially share the at least one screenshot.
According to the embodiment of the application, the electronic device can directly capture the document content according to the content sharing control to obtain at least one screenshot and directly share the at least one screenshot. The user neither needs to take the screenshots manually nor needs to manually select the at least one screenshot in sequence from the image selection interface corresponding to the first application, which improves the efficiency of sharing the at least one screenshot by the electronic device.
Optionally, in the embodiment of the present application, the content sharing control is a paging sharing control, and "performing screenshot on the document content of the document to obtain at least one screenshot" in the step 303 may be specifically implemented by the following step 303a.
Step 303a, the electronic device performs a paging screenshot on the document content of the document to obtain at least one paging screenshot.
Optionally, in the embodiment of the present application, the electronic device may automatically capture each document page in the at least one document page according to the at least one document page occupied by the document content, so as to obtain at least one first paging capture.
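As an illustrative sketch only (not part of the claimed implementation), the per-page capture of step 303a can be modeled in a few lines, where a document page is represented as a bitmap (a list of pixel rows) and `capture_pages` is a hypothetical helper:

```python
# Illustrative model of step 303a: one paging screenshot per document page.
def capture_pages(pages, selected=None):
    """Return one screenshot (a copied bitmap) per captured page.

    pages    -- list of page bitmaps, each a list of pixel rows
    selected -- optional indices of the pages the user selected;
                None means capture every page
    """
    indices = range(len(pages)) if selected is None else sorted(selected)
    return [[row[:] for row in pages[i]] for i in indices]

# A two-page document, each page a 2x2 bitmap
doc = [[[1, 1], [1, 1]], [[2, 2], [2, 2]]]
first_shots = capture_pages(doc)                 # first paging screenshots: all pages
second_shots = capture_pages(doc, selected=[1])  # second paging screenshots: user's pick
```

The same helper covers both variants described above: capturing every page occupied by the document content, or only the pages the user has selected.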
Illustratively, in conjunction with (D) of fig. 10, as shown in (A) of fig. 11, the document preview interface 42 includes a paging sharing control 431, and the user may click the paging sharing control 431, so that, as shown in (B) of fig. 11, the mobile phone may automatically capture the two document pages occupied by the document content to obtain a paging screenshot 44 and a paging screenshot 45, and share the paging screenshot 44 and the paging screenshot 45 in the chat interface 10 with Xiaohong.
Optionally, in the embodiment of the present application, the electronic device may automatically capture each document page selected by the user from the at least one document page, so as to obtain at least one second paging screenshot.
Illustratively, in conjunction with (A) of fig. 11, the document preview interface 42 includes a paging sharing control 431. As shown in (A) of fig. 12, the user may click the page selection controls 45 of the second document page and the third document page among the 3 document pages occupied by the document content, so that the mobile phone may mark the page selection controls 45 of the second document page and the third document page. In (A) of fig. 12, the user may then click the paging sharing control 431, so that, as shown in (B) of fig. 12, the mobile phone may automatically capture the second document page and the third document page to obtain a paging screenshot 46 and a paging screenshot 47, and share the paging screenshot 46 and the paging screenshot 47 in the chat interface 10 with Xiaohong.
According to the embodiment of the application, the electronic device can determine different screenshot modes according to the input of the user to different sharing controls, which improves the flexibility of taking screenshots by the electronic device.
Optionally, in the embodiment of the present application, the content sharing control is a stitching sharing control, and "performing screenshot on the document content of the document to obtain at least one screenshot" in the step 303 may be specifically implemented by the following step 303b.
Step 303b, the electronic device performs long screenshot on the document content of the document to obtain a long screenshot image.
In the embodiment of the application, the electronic device may start the screenshot from the first document page of the at least one document page occupied by the document content and end at the last document page of the at least one document page, so as to obtain a long screenshot image.
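The first-to-last-page long screenshot described above can be sketched as a vertical concatenation of page bitmaps (an illustrative model; `long_screenshot` is a hypothetical name, and a real implementation would capture rendered pixels):

```python
def long_screenshot(pages):
    """Stack page bitmaps top-to-bottom, from the first document page to the
    last, into one long image (a single list of pixel rows)."""
    long_img = []
    for page in pages:
        long_img.extend(row[:] for row in page)
    return long_img

# Three pages of heights 2, 2 and 1
doc = [[[1, 1], [1, 1]], [[2, 2], [2, 2]], [[3, 3]]]
img = long_screenshot(doc)  # height of the long image = total height of all pages
```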
Illustratively, in conjunction with (D) of fig. 10, as shown in (A) of fig. 13, the document preview interface 42 includes a stitching sharing control 432, and the user may click the stitching sharing control 432, so that, as shown in (B) of fig. 13, the mobile phone may perform a long screenshot on the two document pages occupied by the document content to obtain a long screenshot image 48 corresponding to the document content, and share the long screenshot image 48 in the chat interface 10 with Xiaohong.
Optionally, in the embodiment of the present application, the electronic device may capture the document pages selected by the user from the at least one document page occupied by the document content to obtain at least two third paging screenshots, and splice the at least two third paging screenshots to obtain the long screenshot image.
Illustratively, in conjunction with (D) of fig. 10, as shown in (A) of fig. 14, the document preview interface 42 includes a stitching sharing control 432, and the user may perform a click input on the page selection controls 45 of the first document page and the third document page among the 3 document pages occupied by the document content, so that the mobile phone may mark the page selection controls 45 of the first document page and the third document page. In (A) of fig. 14, the user may then perform a click input on the stitching sharing control 432, so that, as shown in (B) of fig. 14, the mobile phone may automatically capture the first document page and the third document page to obtain two paging screenshots, splice the two paging screenshots to obtain a long screenshot image 49, and share the long screenshot image 49 in the chat interface 10 with Xiaohong.
According to the embodiment of the application, the electronic device can determine different screenshot modes according to the input of the user to different sharing controls, which improves the flexibility of taking screenshots by the electronic device.
Optionally, in an embodiment of the present application, the first object corresponding to the first object identifier is a video link, and the second input includes a sixth sub-input, a seventh sub-input, and an eighth sub-input.
In the embodiment of the present application, the sixth sub-input may be used to trigger the electronic device to display a file management interface of the second application associated with the target selection image.
In the embodiment of the present application, the seventh sub-input may be used to trigger the electronic device to display a video playing interface corresponding to the video link.
In the embodiment of the present application, the eighth sub-input may be used to trigger the electronic device to share the video thumbnail.
The above step 204 may be implemented by the following steps 401 to 403, for example.
Step 401, the electronic device responds to the sixth sub-input of the target selection image, and displays a file management interface of the second application associated with the target selection image.
In the embodiment of the present application, the file management interface includes at least one object identifier, where one object identifier corresponds to one object.
It should be noted that the specific process of step 401 is described in detail in the above embodiment, and is not repeated here.
Step 402, the electronic device responds to a seventh sub-input of the first object identifier in the at least one object identifier, and displays a video playing interface of the first object corresponding to the first object identifier.
In the embodiment of the application, the video playing interface comprises video thumbnails of videos corresponding to the video links.
In the embodiment of the application, the electronic device may display the video playing interface on the image selection interface of the first application in the second manner.
Illustratively, in conjunction with (B) of fig. 3, as shown in (A) of fig. 15, the user may make a click input to the selection image 23, so that, as shown in (B) of fig. 15, the mobile phone may display, through a pop-up window in the image selection interface 20, a video playing interface 50 of the video corresponding to the video link, the video playing interface 50 including a video thumbnail 51 of the video.
In step 403, the electronic device responds to the eighth sub-input of the video thumbnail, establishes a first association relationship between the video thumbnail and the video link, and shares the video thumbnail.
In the embodiment of the application, the video thumbnail carries the first association relationship.
In the embodiment of the application, the electronic device can add the video link to the header file of the video thumbnail, so that the first association relationship between the video thumbnail and the video link is established.
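The application does not specify a header format, but the idea of writing the video link into the thumbnail's header file can be illustrated with a standard PNG `tEXt` chunk (a sketch under that assumption; the `VideoLink` keyword is invented for the example):

```python
import struct
import zlib

def _chunk(ctype: bytes, data: bytes) -> bytes:
    """Build one PNG chunk: length, type, data, CRC over type+data."""
    return (struct.pack(">I", len(data)) + ctype + data
            + struct.pack(">I", zlib.crc32(ctype + data)))

def make_thumbnail_png() -> bytes:
    """Minimal 1x1 RGB PNG standing in for the video thumbnail."""
    sig = b"\x89PNG\r\n\x1a\n"
    ihdr = _chunk(b"IHDR", struct.pack(">IIBBBBB", 1, 1, 8, 2, 0, 0, 0))
    idat = _chunk(b"IDAT", zlib.compress(b"\x00\xff\x00\x00"))  # filter byte + RGB
    return sig + ihdr + idat + _chunk(b"IEND", b"")

def embed_link(png: bytes, link: str) -> bytes:
    """Insert a tEXt chunk carrying the video link right after IHDR."""
    # signature (8 bytes) + IHDR chunk (8 header + 13 data + 4 CRC) = 33 bytes
    text = _chunk(b"tEXt", b"VideoLink\x00" + link.encode("latin-1"))
    return png[:33] + text + png[33:]

def extract_link(png: bytes) -> str:
    """Walk the chunks and read the embedded link back."""
    pos = 8
    while pos < len(png):
        (length,) = struct.unpack(">I", png[pos:pos + 4])
        ctype = png[pos + 4:pos + 8]
        data = png[pos + 8:pos + 8 + length]
        if ctype == b"tEXt" and data.startswith(b"VideoLink\x00"):
            return data[len(b"VideoLink\x00"):].decode("latin-1")
        pos += 12 + length
    raise KeyError("no VideoLink chunk")

thumb = embed_link(make_thumbnail_png(), "https://example.com/video/123")
```

Any receiver that understands the assumed keyword can then recover the video link directly from the shared thumbnail.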
Optionally, in an embodiment of the present application, the video playing interface may include a fourth control, where the fourth control is used to share a video thumbnail.
Optionally, in an embodiment of the present application, the eighth sub-input may be an input to a fourth control.
Illustratively, in conjunction with (B) of fig. 15, as shown in (A) of fig. 16, the user may perform a click input on the video thumbnail 51 and the fourth control 52 (denoted by "Complete" in (A) of fig. 16), so that, as shown in (B) of fig. 16, the mobile phone may establish the first association relationship between the video thumbnail 51 and the video link, and share the video thumbnail 51 in the chat interface 10 with Xiaohong.
In the embodiment of the application, the electronic device can share the video thumbnail of the video corresponding to the video link to other contacts, so that the other contacts can visually check whether the video is the required video through the video thumbnail.
Optionally, in an embodiment of the present application, the video playing interface further includes at least one comment information.
Optionally, in the embodiment of the application, each piece of comment information in the at least one piece of comment information may include at least one of a commenter, comment content, a number of likes, a number of comments, a number of forwards, and the like. Specifically, this may be determined according to actual use requirements, and is not limited in the embodiment of the application.
The above step 403 may be implemented by the following step 403a, for example.
In step 403a, the electronic device, in response to the eighth sub-input to the video thumbnail picture and the first comment information, establishes the first association relationship between the video thumbnail and the video link, generates the video thumbnail based on the video thumbnail picture and the first comment information, and shares the video thumbnail.
In the embodiment of the present application, the first comment information is any one of at least one comment information.
Optionally, in the embodiment of the present application, the electronic device may perform screenshot on the video thumbnail picture and the first comment information respectively to obtain two interface screenshots, and then splice the two interface screenshots to obtain the video thumbnail.
Optionally, in the embodiment of the present application, the electronic device may acquire the video thumbnail picture and the first comment information from the video playing interface, and convert the video thumbnail picture and the first comment information into an image through an image conversion model, so as to obtain the video thumbnail.
For example, the electronic device may convert the video thumbnail picture and the first comment information into machine code, and then add picture attributes, such as width and height, to the machine code, so that the video thumbnail picture and the first comment information can be converted into a picture.
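As a toy illustration of generating the video thumbnail from the thumbnail picture and a piece of comment information (the rendering scheme below is an assumption for the sketch, not the application's machine-code conversion):

```python
def render_text(text, width):
    """Toy rasterizer: rows of character codes, wrapped to the bitmap width
    and zero-padded (stands in for rendering the comment as pixels)."""
    codes = [ord(c) for c in text]
    rows = []
    for i in range(0, len(codes), width):
        row = codes[i:i + width]
        rows.append(row + [0] * (width - len(row)))
    return rows

def compose_thumbnail(thumb_rows, comment):
    """Append the rendered comment below the video thumbnail bitmap."""
    width = len(thumb_rows[0])
    return [row[:] for row in thumb_rows] + render_text(comment, width)

thumb_picture = [[9, 9, 9, 9], [9, 9, 9, 9]]  # stand-in 4-pixel-wide thumbnail
composite = compose_thumbnail(thumb_picture, "nice!")
```

The resulting composite is a single image, so it can be shared through the same path as any other picture.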
Optionally, in an embodiment of the present application, the at least one comment information corresponds to an information selection control, where the information selection control is used to select one comment information from the at least one comment information.
Illustratively, in conjunction with (A) of fig. 16, as shown in (A) of fig. 17, the video playing interface 50 further includes two pieces of comment information, where each piece of comment information corresponds to one information selection control 53, and the user may perform a click input on the information selection control 53 corresponding to the second piece of comment information and the video thumbnail picture 51, so that the electronic device may mark the video thumbnail picture 51 and the information selection control 53, denoted by "√" in (A) of fig. 17. The user may then click the fourth control 52, so that, as shown in (B) of fig. 17, the mobile phone may generate the video thumbnail 54 from the video thumbnail picture 51 and the second piece of comment information, establish the first association relationship between the video thumbnail 54 and the video link, and share the video thumbnail 54 in the chat interface 10 with Xiaohong.
In the embodiment of the application, the electronic device can share the video thumbnail and the comment information to other electronic devices in the form of a video thumbnail, so that users of the other electronic devices can directly view the video content through the video thumbnail, which improves the flexibility of viewing the video content.
Optionally, in an embodiment of the present application, the first object corresponding to the first object identifier is an application installation package, and the second input includes a ninth sub-input and a tenth sub-input.
In the embodiment of the present application, the ninth sub-input is used for triggering the electronic device to display the file management interface of the second application associated with the target selection image.
In the embodiment of the present application, the tenth sub-input is used to trigger the electronic device to share the second image.
Illustratively, the foregoing step 204 may be specifically implemented by the following steps 501 and 502.
Step 501, the electronic device displays a file management interface of a second application associated with the target selection image in response to the ninth sub-input of the target selection image.
In the embodiment of the present application, the file management interface includes at least one object identifier, where one object identifier corresponds to one object.
It should be noted that the specific process of step 501 is described in detail in the above embodiment, and is not repeated here.
Step 502, the electronic device shares a second image in response to a tenth sub-input of a first object identification of the at least one object identification.
Optionally, in the embodiment of the present application, the second image may be user-defined, or preset by the electronic device.
In the embodiment of the application, the electronic device may update the file encoding format of the application installation package to an image encoding format, and add a preset second image to the image encoding format, so as to obtain the second image. In this way, the file encoding is changed into image encoding at the physical layer, while original file information such as the file name and the file format is retained.
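One way to picture this re-encoding (purely illustrative; the container layout below is an assumption, not the claimed format) is to wrap the installation package bytes behind the preset cover image together with a footer that preserves the original file name, so the original file information survives:

```python
import struct

def file_to_image(cover_image: bytes, payload: bytes, filename: str) -> bytes:
    """Wrap an arbitrary file behind a preset cover image.

    Assumed layout: cover image | file bytes | filename bytes | 8-byte footer.
    The footer records the payload and filename lengths, so the original
    file name (and hence its format/extension) is preserved.
    """
    meta = filename.encode("utf-8")
    footer = struct.pack(">II", len(payload), len(meta))
    return cover_image + payload + meta + footer

def image_to_file(blob: bytes):
    """Recover (filename, file bytes) from the wrapped image."""
    payload_len, meta_len = struct.unpack(">II", blob[-8:])
    meta_start = len(blob) - 8 - meta_len
    payload_start = meta_start - payload_len
    return (blob[meta_start:meta_start + meta_len].decode("utf-8"),
            blob[payload_start:meta_start])

cover = b"\x89PNG\r\n\x1a\n" + b"\x00" * 16  # stand-in preset second image
blob = file_to_image(cover, b"APK-BYTES", "activity_plan.apk")
name, data = image_to_file(blob)
```

Because the blob begins with image bytes, it can travel through any channel that accepts pictures, while the receiver can still restore the original installation package.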
Illustratively, in connection with (B) of fig. 10, as shown in (A) of fig. 18, the user may make a click input to the "select file" control 38, so that, as shown in (B) of fig. 18, the mobile phone may display a file selection interface 55, the file selection interface 55 including an activity plan installation package identifier 56. Then, the user may perform a click input on the activity plan installation package identifier 56, so that the mobile phone may update the file encoding format of the installation package corresponding to the activity plan installation package identifier 56 to an image encoding format, and add a preset second image to the image encoding format to obtain a second image 57, and, as shown in (C) of fig. 18, share the second image 57 in the chat interface 10 with Xiaohong.
According to the embodiment of the application, the electronic device can share the application installation package with other electronic devices in an image mode, so that the flexibility of the electronic device in sharing the application installation package is improved.
Optionally, in the embodiment of the present application, the method for sharing objects provided in the embodiment of the present application further includes the following step 601.
In step 601, the electronic device updates the target selection image to the object content of the first object in response to the second input.
In the embodiment of the application, the electronic device can cancel the display of the target selection image and display the object content of the first object in the display area corresponding to the target selection image.
In the case where the first object is a third image, the electronic device may display the third image in a display area corresponding to the target selection image, or
In the case that the first object is a document, the electronic device may display at least one screenshot corresponding to the document in a display area corresponding to the target selection image, or
In the case that the first object is a video link, the electronic device may display a video thumbnail corresponding to the video link in a display area corresponding to the target selection image, or
In the case where the first object is an application installation package, the electronic device may display a second image corresponding to the application installation package in a display area corresponding to the target selection image.
For example, in connection with (B) of fig. 3, as shown in fig. 19, in the case where the first object is the third image, after the user selects the third image 58, the mobile phone may cancel the display of the selection image 21 and display the third image 58 in the display area corresponding to the selection image 21.
According to the embodiment of the application, the electronic equipment can prompt the user whether the currently selected object content is required by the user or not by replacing the target selection image with the object content of the first object, so that the user can intuitively check the selected object content.
Optionally, in the embodiment of the present application, the image selection interface includes a second control, and the method for sharing objects provided in the embodiment of the present application further includes steps 701 to 704 described below.
Step 701, the electronic device receives a third input to the second control.
In the embodiment of the application, the third input is used for triggering the electronic device to display the image chat record corresponding to at least one contact in the first application.
Optionally, in the embodiment of the present application, the third input may be a click input of the user on the second control through a touch device such as a finger or a stylus, a voice command input by the user, a specific gesture input by the user, or another feasible input, which may be specifically determined according to actual use requirements and is not limited in the embodiment of the present application.
The third input may be, for example, a single click input of the user on the second control.
Step 702, the electronic device responds to the third input to display an image chat record corresponding to at least one contact in the first application.
In the embodiment of the present application, the image chat record corresponding to each contact of the at least one contact includes at least one first image identifier, where each first image identifier corresponds to one first image.
In the embodiment of the application, the electronic equipment can acquire the chat record corresponding to at least one contact in the first application, and delete the text chat record in the chat record to obtain the image chat record corresponding to at least one contact in the first application.
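The text-record filtering in this step can be sketched as follows (the message structure is an assumed example, not the first application's actual data model):

```python
def image_chat_records(chats):
    """Per contact, keep only image messages; text chat records are deleted."""
    return {contact: [m for m in msgs if m["type"] == "image"]
            for contact, msgs in chats.items()}

chats = {
    "Zhang San": [{"type": "text", "body": "hi"},
                  {"type": "image", "id": "img1"}],
    "Li Si":     [{"type": "image", "id": "img2"}],
}
records = image_chat_records(chats)
```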
Optionally, in the embodiment of the present application, the electronic device may display, in the second manner, an image chat record corresponding to at least one contact in the first application.
Illustratively, in conjunction with (B) of fig. 3, the image selection interface 20 includes a second control 59. As shown in (A) of fig. 20, the user may make a click input on the second control 59, so that, as shown in (B) of fig. 20, the mobile phone may display a chat record interface 60 corresponding to the second control 59, where the chat record interface 60 includes an image chat record of the user with Zhang San and an image chat record of the user with Li Si. The image chat record of the user with Zhang San includes three fourth image identifiers, each of which includes an image selection control 61; the image chat record of the user with Li Si includes three fifth image identifiers, each of which includes an image selection control 61.
Step 703, the electronic device receives a fourth input of a second image identification of the at least one first image identification.
In the embodiment of the present application, the fourth input is used to trigger the electronic device to share the image corresponding to the second image identifier.
Optionally, in the embodiment of the present application, the fourth input may be a click input of the user on the second image identifier through a touch device such as a finger or a stylus, a voice command input by the user, a specific gesture input by the user, or another feasible input, which may be specifically determined according to actual use requirements and is not limited in the embodiment of the present application.
The fourth input may be, for example, a single click input of the second image identification by the user.
Optionally, in an embodiment of the present application, the chat record interface may include a fourth control, where the fourth control is configured to trigger the electronic device to share the image corresponding to the second image identifier.
Optionally, in an embodiment of the present application, the fourth input may be an input of the user on the second image identifier and the fourth control.
In step 704, the electronic device shares the image corresponding to the second image identifier in response to the fourth input.
In the embodiment of the application, the electronic device can send the image corresponding to the second image identifier to other electronic devices through the first application so as to share the image corresponding to the second image identifier.
Illustratively, in conjunction with (B) of fig. 20, as shown in (A) of fig. 21, the user may make a click input on the image selection control 61 corresponding to the image identifier 62 among the three image identifiers corresponding to Zhang San, so that the mobile phone may mark the image selection control 61, denoted by "√" in (A) of fig. 21. The user may then make a click input on the fourth control 63, so that, as shown in (B) of fig. 21, the mobile phone may display the image 63 corresponding to the image identifier 62 in the message display area 34 of the chat interface 10.
Optionally, in an embodiment of the present application, the second image identifier includes a first sub-image identifier and a second sub-image identifier.
Illustratively, the above step 704 may be implemented specifically by a step 704a described below.
In step 704a, the electronic device, in response to the fourth input, splices the image corresponding to the first sub-image identifier and the image corresponding to the second sub-image identifier to obtain a target image, and shares the target image.
Optionally, in an embodiment of the present application, the fourth input may be an input of an image selection control corresponding to the first sub-image identifier, an image selection control corresponding to the second sub-image identifier, and a fourth control.
Illustratively, in conjunction with (B) of fig. 20, as shown in (A) of fig. 22, the user may make a click input on the image selection control 61 corresponding to the image identifier 62 among the three image identifiers corresponding to Zhang San and on the image selection control 61 corresponding to the image identifier 64 among the three image identifiers corresponding to Li Si, so that the mobile phone may mark the two image selection controls 61, denoted by "√" in (A) of fig. 22. The user may then make a click input on the fourth control 63, so that, as shown in (B) of fig. 22, the mobile phone may splice the image 63 corresponding to the image identifier 62 and the image 65 corresponding to the image identifier 64 to obtain a target image 66, where the target image 66 further includes contact information of the senders of the image 63 and the image 65, shown in the figure as "Li Si's picture" and "Zhang San's picture", and display the target image 66 in the message display area 34 of the chat interface 10.
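The splicing of step 704a, including the sender captions that appear in the target image, can be sketched as follows (bitmaps and the label rendering are illustrative assumptions):

```python
def stitch_labeled(images):
    """Vertically splice images, inserting a caption row (character codes of
    the sender label, truncated and zero-padded) above each image."""
    width = max(len(img[0]) for _, img in images)
    out = []
    for label, img in images:
        caption = [ord(c) for c in label][:width]
        out.append(caption + [0] * (width - len(caption)))
        for row in img:
            out.append(row[:] + [0] * (width - len(row)))
    return out

# Two images from two contacts, spliced into one target image
target = stitch_labeled([
    ("Zhang San's picture", [[1, 1], [1, 1]]),
    ("Li Si's picture",     [[2, 2]]),
])
```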
According to the embodiment of the application, the electronic equipment can search the images required to be shared by the user from the image chat records of the contacts, so that the flexibility of searching the images by the electronic equipment is improved.
As shown in fig. 23, the object sharing method provided by the embodiment of the present application is explained below by way of a specific example, which may specifically be implemented by the following steps 1 to 6.
And step 1, clicking and inputting a first control of the first application by a user.
And 2, the electronic equipment responds to click input of the first control, and an image selection interface corresponding to the picture selector of the first application is displayed.
In an embodiment of the present application, the image selection interface includes at least one selection image, and each selection image in the at least one selection image is associated with a second application.
And 3, clicking and inputting the target selection image in the at least one selection image by the user.
And 4, the electronic equipment responds to click input of the target selection image, and displays a picture selector of a second application associated with the target selection image.
In an embodiment of the present application, the picture selector of the second application includes at least one object identifier, where each object identifier in the at least one object identifier corresponds to one object.
And 5, clicking and inputting a first object identifier in the at least one object identifier by a user.
And 6, the electronic equipment responds to click input of the first object identifier, imports the first object corresponding to the first object identifier into the first application, and shares the object content of the first object.
In the embodiment of the application, by means of the selection image, the electronic device can break through the limitation of a single application and invoke a more powerful picture selector that better matches the user's habits, so that the user can quickly select a required picture. Since an entry for invoking the system picture selector is automatically inserted when the image is selected, the image can be selected without granting the application access to the album permission and without modifying the authorization scope each time, which solves the problems of application authorization and information leakage.
The above-mentioned method embodiments, or various possible implementation manners in the method embodiments, may be executed alone or may be executed in combination with each other on the premise that there is no contradiction, and may be specifically determined according to actual use requirements, which is not limited by the embodiment of the present application.
It should be noted that, in the object sharing method provided by the embodiment of the present application, the execution subject may be an object sharing device. In the embodiment of the present application, an object sharing method performed by an object sharing device is taken as an example, and the object sharing device provided by the embodiment of the present application is described.
Fig. 24 is a schematic diagram of a possible structure of an object sharing apparatus according to an embodiment of the present application. As shown in fig. 24, the object sharing apparatus 70 may include a receiving module 71, a display module 72, and a sharing module 73.
Wherein the receiving module 71 is configured to receive a first input to a first control in a first application. The display module 72 is configured to display at least one selection image on the image selection interface corresponding to the first application, where one selection image of the at least one selection image is associated with one application program, in response to the first input received by the receiving module 71. The receiving module 71 is further configured to receive a second input for selecting a target selection image of the at least one selection image. The sharing module 73 is configured to share the object content of at least one object in the second application associated with the target selection image in response to the second input received by the receiving module 71.
In one possible implementation, the second input includes a first sub-input and a second sub-input.
The display module 72 is specifically configured to display, in response to the first sub-input of the target selection image received by the receiving module 71, a file management interface of the second application associated with the target selection image, where the file management interface includes at least one object identifier, and one object identifier corresponds to one object. The sharing module 73 is specifically configured to share, in response to the second sub-input of the first object identifier in the at least one object identifier received by the receiving module 71, object content of the first object corresponding to the first object identifier.
In a possible implementation manner, the first object corresponding to the first object identifier is a document, the second input includes a third sub-input, a fourth sub-input, and a fifth sub-input, and the object sharing apparatus 70 provided in the embodiment of the present application further includes a processing module. The display module 72 is specifically configured to display, in response to the third sub-input of the target selection image received by the receiving module 71, a file management interface of the second application associated with the target selection image, where the file management interface includes at least one object identifier, and one object identifier corresponds to one object, and display, in response to the fourth sub-input of the first object identifier received by the receiving module 71, a document preview interface of the first object corresponding to the first object identifier, where the document preview interface includes a content sharing control. And the processing module is used for responding to the fifth sub-input of the receiving module 71 to the content sharing control, and capturing the document content of the document to obtain at least one capturing. The sharing module 73 is specifically configured to share at least one screenshot.
In one possible implementation, the content sharing control is a pagination sharing control, and the processing module is specifically configured to perform a pagination screenshot on the document content of the document to obtain at least one pagination screenshot; or the content sharing control is a jigsaw sharing control, and the processing module is specifically configured to perform a long screenshot on the document content of the document to obtain a long screenshot image.
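The two screenshot modes can be illustrated with a minimal sketch in which the rendered document is modelled as a list of lines and a "screenshot" as a tuple of lines. The function names and the line-based model are assumptions made purely for illustration:

```python
# Hypothetical sketch of the pagination vs. long-screenshot modes.

def paginated_screenshots(lines, page_height):
    """Pagination sharing control: one screenshot per page of content."""
    return [tuple(lines[i:i + page_height])
            for i in range(0, len(lines), page_height)]

def long_screenshot(lines):
    """Jigsaw sharing control: a single long screenshot of all content."""
    return tuple(lines)
```

The design difference is only in how the same content is cut: pagination yields several page-sized images suitable for per-page sharing, while the long screenshot yields one continuous image.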
In one possible implementation, the first object corresponding to the first object identifier is a video link, the second input includes a sixth sub-input, a seventh sub-input, and an eighth sub-input, and the object sharing apparatus 70 provided in the embodiment of the present application further includes a processing module. The display module 72 is specifically configured to display, in response to the sixth sub-input of the target selection image received by the receiving module 71, a file management interface of the second application associated with the target selection image, where the file management interface includes at least one object identifier, and one object identifier corresponds to one object; and to display, in response to the seventh sub-input of the first object identifier received by the receiving module 71, a video playing interface of the first object corresponding to the first object identifier, where the video playing interface includes a video thumbnail of the video corresponding to the video link. The processing module is configured to establish a first association relationship between the video thumbnail and the video link in response to the eighth sub-input of the video thumbnail received by the receiving module 71. The sharing module 73 is specifically configured to share the video thumbnail, where the video thumbnail carries the first association relationship.
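One way to picture the first association relationship is metadata attached to the shared thumbnail so that the thumbnail "carries" the video link to the receiver. The data structures and field names below are illustrative assumptions, not the embodiment's actual format:

```python
# Hypothetical sketch: the shared thumbnail carries the video link as
# attached metadata, which the receiving side can resolve to play the video.

from dataclasses import dataclass, field

@dataclass
class SharedThumbnail:
    image_bytes: bytes
    metadata: dict = field(default_factory=dict)

def associate_and_share(thumbnail_bytes, video_link):
    # Establish the first association relationship between the video
    # thumbnail and the video link, then return the shareable thumbnail
    thumb = SharedThumbnail(thumbnail_bytes)
    thumb.metadata["video_link"] = video_link
    return thumb

def open_shared_thumbnail(thumb):
    # Receiver side: resolve the carried association to obtain the link
    return thumb.metadata.get("video_link")
```

A real implementation might instead embed the link in the image file's own metadata or in the share payload; the sketch only shows that the association travels with the thumbnail.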
In one possible implementation, the display module 72 is further configured to update the target selection image to the object content of the first object in response to the second input received by the receiving module 71.
In one possible implementation, the image selection interface includes a second control. The receiving module 71 is further configured to receive a third input to the second control. The display module 72 is further configured to display, in response to the third input received by the receiving module 71, an image chat record corresponding to at least one contact in the first application, where the image chat record corresponding to each contact includes at least one first image identifier, and each first image identifier corresponds to one first image. The receiving module 71 is further configured to receive a fourth input of a second image identifier of the at least one first image identifier. The sharing module 73 is further configured to share the image corresponding to the second image identifier in response to the fourth input received by the receiving module 71.
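The second-control flow above can be sketched as grouping a contact's image messages into per-contact image chat records and then sharing the image chosen by its identifier. The chat-history layout is an assumption for illustration only:

```python
# Hypothetical sketch of the image chat record flow. Chat history is
# modelled as (contact, message_type, payload) triples, where an image
# payload is (image identifier, image content).

def image_chat_records(chat_history):
    """Group image messages by contact, keeping (image_id, image) pairs."""
    records = {}
    for contact, message_type, payload in chat_history:
        if message_type == "image":
            image_id, image = payload
            records.setdefault(contact, []).append((image_id, image))
    return records

def share_image(records, image_id):
    """Share the image whose first image identifier matches image_id."""
    for images in records.values():
        for candidate_id, image in images:
            if candidate_id == image_id:
                return image
    return None
```

Non-image messages are filtered out, which mirrors the point of the second control: the user browses only image chat records rather than the full conversation.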
The embodiment of the application provides an object sharing apparatus in which each application program is associated with one selection image, so that the apparatus can directly invoke, through a selection image displayed in the image selection interface, the application program corresponding to that selection image. The apparatus can therefore directly import the object selected by the user in that application program into the first application and share it directly. In other words, when the image selection interface of the first application does not include the object that the user needs to share, there is no need to first save the object from another application program into an album application, then import it into the image selection interface through the album application, and only then share it. This simplifies the steps of locating the object, and thereby improves the efficiency with which the apparatus shares objects.
The object sharing apparatus in the embodiment of the application may be an electronic device, or may be a component in an electronic device, such as an integrated circuit or a chip. The electronic device may be a terminal, or may be a device other than a terminal. For example, the electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a mobile internet device (Mobile Internet Device, MID), an augmented reality (Augmented Reality, AR)/virtual reality (Virtual Reality, VR) device, a robot, a wearable device, an ultra-mobile personal computer (Ultra-Mobile Personal Computer, UMPC), a netbook, or a personal digital assistant (Personal Digital Assistant, PDA); it may also be a server, a network attached storage (Network Attached Storage, NAS), a personal computer (Personal Computer, PC), a television (Television, TV), a teller machine, a self-service machine, or the like, which is not particularly limited in the embodiments of the present application.
The object sharing device in the embodiment of the application may be a device with an operating system. The operating system may be an Android operating system, an iOS operating system, or other possible operating systems, and the embodiment of the present application is not limited specifically.
The object sharing device provided by the embodiment of the present application can implement each process implemented by the above embodiment, and in order to avoid repetition, details are not repeated here.
Optionally, as shown in Fig. 25, an embodiment of the present application further provides an electronic device 90, including a processor 91 and a memory 92, where the memory 92 stores a program or instruction executable on the processor 91. When the program or instruction is executed by the processor 91, the steps of the above embodiment of the object sharing method are implemented with the same technical effects; to avoid repetition, details are not repeated here.
The electronic device in the embodiment of the application includes the mobile electronic device and the non-mobile electronic device.
Fig. 26 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 100 includes, but is not limited to, a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, and a processor 110.
Those skilled in the art will appreciate that the electronic device 100 may further include a power source (e.g., a battery) for supplying power to the various components; the power source may be logically connected to the processor 110 via a power management system, so that functions such as charging management, discharging management, and power consumption management are performed through the power management system. The electronic device structure shown in Fig. 26 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than shown, combine some of the components, or arrange the components differently, which will not be described in detail here.
Wherein the user input unit 107 is configured to receive a first input to a first control in a first application. And the display unit 106 is used for responding to the first input and displaying at least one selection image on the image selection interface corresponding to the first application, wherein one selection image is associated with one application program. The user input unit 107 is further configured to receive a second input for selecting a target selection image of the at least one selection image. The processor 110 is configured to share object content of at least one object in the second application associated with the target selection image in response to the second input.
Optionally, in an embodiment of the present application, the second input includes a first sub-input and a second sub-input. The display unit 106 is specifically configured to display, in response to the first sub-input of the target selection image, a file management interface of the second application associated with the target selection image, where the file management interface includes at least one object identifier, and one object identifier corresponds to one object. The processor 110 is specifically configured to share, in response to a second sub-input of a first object identifier of the at least one object identifier, object content of a first object corresponding to the first object identifier.
Optionally, in the embodiment of the present application, the first object corresponding to the first object identifier is a document, and the second input includes a third sub-input, a fourth sub-input and a fifth sub-input. The display unit 106 is specifically configured to display, in response to a third sub-input of the target selection image, a file management interface of the second application associated with the target selection image, where the file management interface includes at least one object identifier, one object identifier corresponds to one object, and in response to a fourth sub-input of the first object identifier in the at least one object identifier, display a document preview interface of the first object corresponding to the first object identifier, where the document preview interface includes a content sharing control. The processor 110 is specifically configured to, in response to a fifth sub-input to the content sharing control, perform screenshot on document content of the document, obtain at least one screenshot, and share the at least one screenshot.
Optionally, in the embodiment of the present application, the content sharing control is a pagination sharing control, and the processor 110 is specifically configured to perform a pagination screenshot on the document content of the document to obtain at least one pagination screenshot; or the content sharing control is a jigsaw sharing control, and the processor 110 is specifically configured to perform a long screenshot on the document content of the document to obtain a long screenshot image.
Optionally, in the embodiment of the present application, the first object corresponding to the first object identifier is a video link, and the second input includes a sixth sub-input, a seventh sub-input, and an eighth sub-input. The display unit 106 is specifically configured to display, in response to the sixth sub-input of the target selection image, a file management interface of the second application associated with the target selection image, where the file management interface includes at least one object identifier, and one object identifier corresponds to one object; and to display, in response to the seventh sub-input of the first object identifier in the at least one object identifier, a video playing interface of the first object corresponding to the first object identifier, where the video playing interface includes a video thumbnail of the video corresponding to the video link. The processor 110 is specifically configured to, in response to the eighth sub-input of the video thumbnail, establish a first association relationship between the video thumbnail and the video link, and share the video thumbnail, where the video thumbnail carries the first association relationship.
Optionally, in an embodiment of the present application, the display unit 106 is further configured to update the target selection image to the object content of the first object in response to the second input.
Optionally, in the embodiment of the present application, the image selection interface includes a second control, and the user input unit 107 is further configured to receive a third input to the second control. The display unit 106 is further configured to display, in response to the third input, an image chat record corresponding to at least one contact in the first application, where the image chat record corresponding to each contact includes at least one first image identifier, and each first image identifier corresponds to one first image. The user input unit 107 is further configured to receive a fourth input of a second image identifier of the at least one first image identifier. The processor 110 is further configured to share, in response to the fourth input, the image corresponding to the second image identifier.
The embodiment of the application provides an electronic device in which each application program is associated with one selection image, so that the electronic device can directly invoke, through a selection image displayed in the image selection interface, the application program corresponding to that selection image. The electronic device can therefore directly import the object selected by the user in that application program into the first application and share it directly. In other words, when the image selection interface of the first application does not include the object that the user needs to share, there is no need to first save the object from another application program into an album application, then import it into the image selection interface through the album application, and only then share it. This simplifies the steps of locating the object, and thereby improves the efficiency with which the electronic device shares objects.
The electronic device provided by the embodiment of the application can realize each process realized by the embodiment of the method and can achieve the same technical effect, and in order to avoid repetition, the description is omitted here.
The beneficial effects of the various implementation manners in this embodiment may be specifically referred to the beneficial effects of the corresponding implementation manners in the foregoing method embodiment, and in order to avoid repetition, the description is omitted here.
It should be appreciated that, in embodiments of the present application, the input unit 104 may include a graphics processing unit (Graphics Processing Unit, GPU) 1041 and a microphone 1042, where the graphics processor 1041 processes image data of still images or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display, an organic light-emitting diode display, or the like. The user input unit 107 includes at least one of a touch panel 1071 and other input devices 1072. The touch panel 1071 is also referred to as a touch screen, and may include two parts: a touch detection device and a touch controller. The other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail here.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a first storage area storing programs or instructions and a second storage area storing data, where the first storage area may store an operating system, and application programs or instructions required for at least one function (such as a sound playing function and an image playing function), and the like. Furthermore, the memory 109 may include a volatile memory or a nonvolatile memory, or may include both a volatile memory and a nonvolatile memory. The nonvolatile memory may be a read-only memory (Read-Only Memory, ROM), a programmable ROM (Programmable ROM, PROM), an erasable PROM (Erasable PROM, EPROM), an electrically erasable PROM (Electrically Erasable PROM, EEPROM), or a flash memory. The volatile memory may be a random access memory (Random Access Memory, RAM), a static RAM (Static RAM, SRAM), a dynamic RAM (Dynamic RAM, DRAM), a synchronous DRAM (Synchronous DRAM, SDRAM), a double data rate SDRAM (Double Data Rate SDRAM, DDR SDRAM), an enhanced SDRAM (Enhanced SDRAM, ESDRAM), a synch-link DRAM (Synch-Link DRAM, SLDRAM), or a direct rambus RAM (Direct Rambus RAM, DRRAM). The memory 109 in embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
The processor 110 may include one or more processing units. Optionally, the processor 110 integrates an application processor, which primarily handles operations involving the operating system, user interface, application programs, and the like, and a modem processor, which primarily handles wireless communication signals, such as a baseband processor. It will be appreciated that the modem processor may alternatively not be integrated into the processor 110.
The embodiment of the application also provides a readable storage medium, on which a program or an instruction is stored, which when executed by a processor, implements each process of the above method embodiment, and can achieve the same technical effects, and in order to avoid repetition, the description is omitted here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The embodiment of the application further provides a chip, which comprises a processor and a communication interface, wherein the communication interface is coupled with the processor, and the processor is used for running programs or instructions to realize the processes of the embodiment of the method, and can achieve the same technical effects, so that repetition is avoided, and the description is omitted here.
It should be understood that the chip referred to in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-chip.
Embodiments of the present application provide a computer program product stored in a storage medium, where the program product is executed by at least one processor to implement the respective processes of the above-described object sharing method embodiment, and achieve the same technical effects, and for avoiding repetition, a detailed description is omitted herein.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the statement "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises that element. Furthermore, it should be noted that the scope of the methods and apparatus in the embodiments of the present application is not limited to performing the functions in the order shown or discussed; functions may also be performed in a substantially simultaneous manner or in the reverse order, depending on the functions involved. For example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by means of software plus a necessary general-purpose hardware platform, or by means of hardware, though in many cases the former is the preferred implementation. Based on such understanding, the technical solution of the present application, or the part of it contributing to the prior art, may be embodied in the form of a computer software product stored in a storage medium (e.g., ROM/RAM, magnetic disk, or optical disk), including instructions for causing a terminal (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the methods according to the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive. Many variations may be made by those of ordinary skill in the art, in light of the present application, without departing from the spirit of the present application and the scope of the claims, and all such variations fall within the protection of the present application.