This sample demonstrates the camera features. It uses the libohcamera.so interfaces to implement preview, photo capture, and video recording, switching between the front and rear cameras for capture and recording, and control features such as focus and exposure.
How to Use
The camera feature interfaces are implemented in CameraManager.cpp.
The NDKCamera constructor performs the camera lifecycle initialization: it calls OH_Camera_GetCameraManager to obtain the CameraManager, OH_CameraManager_CreateCaptureSession to create a CaptureSession, CaptureSessionRegisterCallback to register the CaptureSession callbacks, GetSupportedCameras to obtain the supported camera devices, GetSupportedOutputCapability to obtain the supported output capabilities of those devices, CreatePreviewOutput to create the preview output, CreateCameraInput to create the camera input, CameraInputOpen to open the camera input, CameraManagerRegisterCallback to register the CameraManager callbacks, and finally SessionFlowFn to start the session.
SessionFlowFn is the step that starts the preview. Its main flow is: call OH_CaptureSession_BeginConfig to begin session configuration, OH_CaptureSession_AddInput to add the CameraInput to the session, OH_CaptureSession_AddPreviewOutput to add the previewOutput to the session, OH_CaptureSession_CommitConfig to commit the configuration, and OH_CaptureSession_Start to start the session. While starting the preview it also calls IsFocusMode to enable focusing, which is described in the focus section below.
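The session flow above is a chain of calls where each step must succeed before the next runs. The sketch below shows only that control flow in plain C++; the Session stubs and the SessionFlow name are hypothetical stand-ins, while the real code makes the corresponding OH_CaptureSession_* calls on a Camera_CaptureSession:

```cpp
// Stand-in for Camera_ErrorCode from the camera NDK.
enum Camera_ErrorCode { CAMERA_OK = 0, CAMERA_SERVICE_FATAL_ERROR = 1 };

// Hypothetical stubs for the real OH_CaptureSession_* calls; on a device
// each would take the Camera_CaptureSession (and input/output) pointers.
static Camera_ErrorCode BeginConfig() { return CAMERA_OK; }
static Camera_ErrorCode AddInput() { return CAMERA_OK; }
static Camera_ErrorCode AddPreviewOutput() { return CAMERA_OK; }
static Camera_ErrorCode CommitConfig() { return CAMERA_OK; }
static Camera_ErrorCode StartSession() { return CAMERA_OK; }

// Mirrors the shape of SessionFlowFn: stop at the first failing step
// instead of continuing with a half-configured session.
Camera_ErrorCode SessionFlow()
{
    Camera_ErrorCode ret = BeginConfig();
    if (ret != CAMERA_OK) { return ret; }
    ret = AddInput();
    if (ret != CAMERA_OK) { return ret; }
    ret = AddPreviewOutput();
    if (ret != CAMERA_OK) { return ret; }
    ret = CommitConfig();
    if (ret != CAMERA_OK) { return ret; }
    return StartSession();
}
```

The early returns matter: committing a configuration after a failed AddInput would leave the session in an inconsistent state.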
The NDKCamera destructor releases the camera lifecycle: it calls OH_CameraManager_DeleteSupportedCameras to delete the supported camera device list, OH_CameraManager_DeleteSupportedCameraOutputCapability to delete the supported output capability set, and OH_Camera_DeleteCameraManager to delete the camera manager.
The photo capture interfaces are encapsulated in StartPhoto. The main flow is: call SessionStop to stop the session, SessionBegin to prepare the session, CreatePhotoOutput to create the photo output, OH_CaptureSession_AddPhotoOutput to add the photoOutput to the session, SessionCommitConfig to commit the session, then SessionStart to start the session, and finally TakePicture to trigger the capture.
The video recording interfaces are encapsulated in StartVideo. The main flow is: call SessionStop to stop the session, SessionBegin to prepare the session, OH_CaptureSession_RemovePhotoOutput to remove the photo output, then CreatePhotoOutput to recreate the photo output and AddPhotoOutput to add it back to the session, CreateVideoOutput to create the video output and AddVideoOutput to add it to the session, then SessionCommitConfig and SessionStart to commit and start the session, and finally VideoOutputRegisterCallback to register the VideoOutput callbacks.
The exposure interfaces are encapsulated in IsExposureModeSupportedFn. The main flow is: call OH_CaptureSession_IsExposureModeSupported to check whether the exposure mode is supported, then OH_CaptureSession_SetExposureMode to set it, and OH_CaptureSession_GetExposureMode to read back the applied mode. Exposure compensation goes through the IsExposureBiasRange interface: OH_CaptureSession_GetExposureBiasRange obtains the supported bias range, OH_CaptureSession_SetExposureBias sets the bias value, and OH_CaptureSession_GetExposureBias reads back the applied value.
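Because OH_CaptureSession_GetExposureBiasRange reports the minimum and maximum bias the device supports, a requested value should be clamped into that range before being passed to OH_CaptureSession_SetExposureBias. A minimal sketch of the clamping (the helper name is ours, not part of the sample):

```cpp
// Clamp a requested exposure bias into the [minBias, maxBias] range
// reported by OH_CaptureSession_GetExposureBiasRange before applying it.
float ClampExposureBias(float requested, float minBias, float maxBias)
{
    if (requested < minBias) { return minBias; }
    if (requested > maxBias) { return maxBias; }
    return requested;
}
```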
The focus interfaces are encapsulated in IsFocusMode. The main flow is: call OH_CaptureSession_IsFocusModeSupported to check whether the focus mode is supported, OH_CaptureSession_SetFocusMode to set it, and OH_CaptureSession_GetFocusMode to read back the applied mode.
The IsFocusPoint interface handles the focus point: it calls OH_CaptureSession_SetFocusPoint to apply the focus point delivered from the JS side, then OH_CaptureSession_GetFocusPoint to read back the applied point.
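The focus point the JS side delivers originates from a finger tap on the XComponent, so the raw component-pixel coordinates have to be converted before use, assuming (as in the ArkTS camera API) that a focus point is expressed in coordinates normalized to [0, 1] over the preview surface. A small illustrative sketch (the helper is not part of the sample):

```cpp
struct NormalizedPoint {
    float x;
    float y;
};

// Convert a tap position in component pixels to normalized [0, 1]
// coordinates relative to the preview surface size.
NormalizedPoint NormalizeFocusPoint(float tapX, float tapY, float width, float height)
{
    NormalizedPoint p = { tapX / width, tapY / height };
    return p;
}
```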
The video stabilization interfaces are encapsulated in IsVideoStabilizationModeSupportedFn. The main flow is: call OH_CaptureSession_IsVideoStabilizationModeSupported to query whether the specified stabilization mode is supported, OH_CaptureSession_SetVideoStabilizationMode to set it, and OH_CaptureSession_GetVideoStabilizationMode to read back the applied mode.
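The pattern here is query-then-set: OH_CaptureSession_SetVideoStabilizationMode should only be called with a mode that OH_CaptureSession_IsVideoStabilizationModeSupported reported as available. Sketched below with a stand-in support check; the stub and SelectStabilizationMode are hypothetical, and the pretend support table (modes 0 and 3) is arbitrary:

```cpp
// Stand-in for OH_CaptureSession_IsVideoStabilizationModeSupported:
// pretend only modes 0 and 3 are available on this device.
static bool IsStabilizationModeSupported(int mode) { return mode == 0 || mode == 3; }

// Apply the requested mode if supported, otherwise fall back to mode 0 (off).
int SelectStabilizationMode(int requested)
{
    return IsStabilizationModeSupported(requested) ? requested : 0;
}
```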
Callback registration:
The preview, photo capture, video recording, and front/rear switching features are invoked from tableIndex.ets, modeSwitchPage.ets, and main.cpp. Source reference: [Index.ets]
```ts
/*
 * Copyright (c) 2024 Huawei Device Co., Ltd.
 * Licensed under the Apache License, Version 2.0 (the 'License');
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an 'AS IS' BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
import { abilityAccessCtrl } from '@kit.AbilityKit';
import { common } from '@kit.AbilityKit';
import { display } from '@kit.ArkUI';
import { photoAccessHelper } from '@kit.MediaLibraryKit';
import { dataSharePredicates } from '@kit.ArkData';
import { image } from '@kit.ImageKit';
import cameraDemo from 'libentry.so';
import Logger from '../common/utils/Logger';
import { DividerPage } from '../views/DividerPage';
import { ModeSwitchPage } from '../views/ModeSwitchPage';
import { FocusPage } from '../views/FocusPage';
import { FocusAreaPage } from '../views/FocusAreaPage';
import { Constants } from '../common/Constants';
import { SettingDataObj } from '../common/Constants';
import DisplayCalculator from '../common/DisplayCalculator';

const TAG: string = 'UI indexPage';
let context = getContext(this) as common.UIAbilityContext;

@Entry
@Component
struct Index {
  // surfaceID value.
  @State surfaceId: string = '';
  // Select mode.
  @State modelBagCol: string = 'photo';
  // Exposure area.
  @State focusPointBol: boolean = false;
  // Finger click coordinates in the exposure area.
  @State focusPointVal: Array<number> = [0, 0];
  // Display where scale, focal length value, and focus box cannot coexist.
  @State exposureBol: boolean = true;
  // Exposure value.
  @State exposureNum: number = 0;
  // Countdown, photography, and video recording.
  @State countdownNum: number = 0;
  // Front and rear cameras.
  @State cameraDeviceIndex: number = 0;
  @State xComponentWidth: number = 384;
  @State xComponentHeight: number = 450;
  // Reference line.
  @State referenceLineBol: boolean = false;
  @StorageLink('defaultAspectRatio') @Watch('initXComponentSize') defaultAspectRatio: number = Constants.MIN_ASPECT_RATIO;
  @State onShow: boolean = false;
  // Thumbnails.
  @StorageLink('thumbnail') thumbnail: image.PixelMap | undefined | string = '';
  // XComponentController.
  private mXComponentController: XComponentController = new XComponentController();
  private screenHeight: number = 0;
  private screenWidth: number = 0;
  private settingDataObj: SettingDataObj = {
    mirrorBol: false,
    videoStabilizationMode: 0,
    exposureMode: 1,
    focusMode: 2,
    photoQuality: 1,
    locationBol: false,
    photoFormat: 1,
    photoOrientation: 0,
    photoResolution: 0,
    videoResolution: 0,
    videoFrame: 0,
    referenceLineBol: false
  };
  private appContext: common.Context = getContext(this);
  atManager = abilityAccessCtrl.createAtManager();

  // Entry initialization function.
  async aboutToAppear() {
    await this.requestPermissionsFn();
    let mDisplay = display.getDefaultDisplaySync();
    this.screenWidth = px2vp(mDisplay.width);
    this.screenHeight = px2vp(mDisplay.height);
    this.initXComponentSize();
  }

  initXComponentSize(): void {
    let defaultSize = DisplayCalculator.calcSurfaceDisplaySize(this.screenWidth, this.screenHeight, this.defaultAspectRatio);
    this.xComponentWidth = defaultSize.width;
    this.xComponentHeight = defaultSize.height;
  }

  async aboutToDisAppear() {
    cameraDemo.releaseCamera();
  }

  // Obtain permissions.
  async requestPermissionsFn() {
    Logger.info(TAG, `requestPermissionsFn entry`);
    try {
      this.atManager.requestPermissionsFromUser(this.appContext, [
        'ohos.permission.CAMERA',
        'ohos.permission.MICROPHONE',
        'ohos.permission.READ_MEDIA',
        'ohos.permission.WRITE_MEDIA',
        'ohos.permission.WRITE_IMAGEVIDEO',
        'ohos.permission.READ_IMAGEVIDEO'
      ]).then(() => {
        Logger.info(TAG, `request Permissions success!`);
        this.onShow = true;
        this.getThumbnail();
      });
    } catch (err) {
      Logger.error(TAG, `requestPermissionsFromUser call Failed! error: ${err.code}`);
    }
  }

  async getThumbnail() {
    let phAccessHelper = photoAccessHelper.getPhotoAccessHelper(context);
    let predicates: dataSharePredicates.DataSharePredicates = new dataSharePredicates.DataSharePredicates();
    let fetchOptions: photoAccessHelper.FetchOptions = {
      fetchColumns: [],
      predicates: predicates
    };
    let fetchResult: photoAccessHelper.FetchResult<photoAccessHelper.PhotoAsset> = await phAccessHelper.getAssets(fetchOptions);
    let asset: photoAccessHelper.PhotoAsset = await fetchResult.getLastObject();
    console.info('asset displayName = ', asset.displayName);
    asset.getThumbnail((err, pixelMap) => {
      if (err === undefined) {
        this.thumbnail = pixelMap;
        console.info('getThumbnail successful ' + pixelMap);
      } else {
        console.error(`getThumbnail fail with error: ${err.code}, ${err.message}`);
      }
    });
  }

  async onPageShow() {
    Logger.info(TAG, `onPageShow App`);
    if (this.surfaceId && this.onShow) {
      Logger.error(TAG, `initCamera start`);
      cameraDemo.initCamera(this.surfaceId, this.settingDataObj.focusMode, this.cameraDeviceIndex);
      Logger.error(TAG, `initCamera end`);
    }
    this.getThumbnail();
  }

  onPageHide() {
    Logger.info(TAG, `onPageHide App`);
    this.thumbnail = '';
    cameraDemo.releaseCamera();
  }

  build() {
    Stack() {
      if (this.onShow) {
        // General appearance of a picture.
        XComponent({
          id: 'componentId',
          type: 'surface',
          controller: this.mXComponentController
        })
          .onLoad(async () => {
            Logger.info(TAG, 'onLoad is called');
            this.surfaceId = this.mXComponentController.getXComponentSurfaceId();
            Logger.info(TAG, `onLoad surfaceId: ${this.surfaceId}`);
            Logger.info(TAG, `initCamera start`);
            cameraDemo.initCamera(this.surfaceId, this.settingDataObj.focusMode, this.cameraDeviceIndex);
            Logger.info(TAG, `initCamera end`);
          })
          .backgroundColor(Color.Black)
          .width(Constants.FULL_PERCENT)
          .height(Constants.SEVENTY_PERCENT)
          .margin({ bottom: Constants.FIFTEEN_PERCENT })
        // Reference line.
        DividerPage({ referenceLineBol: this.referenceLineBol });
        // Exposure frame and focus frame.
        FocusPage({
          focusPointBol: $focusPointBol,
          focusPointVal: $focusPointVal,
          exposureBol: $exposureBol,
          exposureNum: $exposureNum
        });
        // Exposure focusing finger click area.
        FocusAreaPage({
          focusPointBol: $focusPointBol,
          focusPointVal: $focusPointVal,
          exposureBol: $exposureBol,
          exposureNum: $exposureNum,
          xComponentWidth: this.xComponentWidth,
          xComponentHeight: this.xComponentHeight
        });
        // Reverse camera_Multiple workstations_Take photos_Video.
        ModeSwitchPage({
          surfaceId: this.surfaceId,
          cameraDeviceIndex: $cameraDeviceIndex,
          countdownNum: $countdownNum
        });
      }
    }
    .height(Constants.FULL_PERCENT)
    .width(Constants.FULL_PERCENT)
    .backgroundColor(Color.Black)
  }
}
```
[ModeSwitchPage.ets]

```ts
/*
 * Copyright (c) 2023 Huawei Device Co., Ltd.
 * Licensed under the Apache License, Version 2.0 (the 'License');
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an 'AS IS' BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
// Reverse camera_ Multiple workstations_ Take photos Video.
import { photoAccessHelper } from '@kit.MediaLibraryKit';
import { dataSharePredicates } from '@kit.ArkData';
import { fileIo } from '@kit.CoreFileKit';
import { BusinessError, deviceInfo } from '@kit.BasicServicesKit';
import { common } from '@kit.AbilityKit';
import { image } from '@kit.ImageKit';
import { media } from '@kit.MediaKit';
import cameraDemo from 'libentry.so';
import Logger from '../common/utils/Logger';
import MediaUtils from '../common/utils/MediaUtils';
import { SettingDataObj } from '../common/Constants';
import { Constants } from '../common/Constants';

let context = getContext(this) as common.UIAbilityContext;

interface PhotoSettings {
  quality: number, // Photo quality
  rotation: number, // Photo direction
  mirror: boolean, // Mirror Enable
  latitude: number, // geographic location
  longitude: number, // geographic location
  altitude: number // geographic location
};

interface PhotoRotationMap {
  rotation0: number,
  rotation90: number,
  rotation180: number,
  rotation270: number,
};

@Component
export struct ModeSwitchPage {
  @State videoId: string = '';
  @State mSurfaceId: string = '';
  // Front and rear cameras.
  @Link cameraDeviceIndex: number;
  // SurfaceID.
  @Prop surfaceId: string;
  // Countdown value.
  @Link countdownNum: number;
  // Countdown timer.
  @State countTimerInt: number = -1;
  @State countTimerOut: number = -1;
  // Recording time.
  @State videoRecodeTime: number = 0;
  // Recording time timer.
  @State timer: number = -1;
  // Select mode.
  @State modelBagCol: string = Constants.PHOTO;
  // Choose camera or capture.
  @State @Watch('onChangeIsModeBol') isModeBol: boolean = true;
  // Thumbnails.
  @StorageLink('thumbnail') thumbnail: image.PixelMap | undefined | string = '';
  private tag: string = 'sample modeSwitchPage:';
  private mediaUtil = MediaUtils.getInstance();
  private photoAsset?: string;
  private fd: number = -1;
  private cameraSize: image.Size = {
    width: 1280,
    height: 720
  };
  private photoSettings: PhotoSettings = {
    quality: 0,
    rotation: 0,
    mirror: false,
    latitude: Constants.LATITUDE,
    longitude: Constants.LONGITUDE,
    altitude: Constants.ALTITUDE
  };
  private mReceiver?: image.ImageReceiver;
  private videoRecorder?: media.AVRecorder;
  private videoConfig: media.AVRecorderConfig = {
    audioSourceType: media.AudioSourceType.AUDIO_SOURCE_TYPE_MIC,
    videoSourceType: media.VideoSourceType.VIDEO_SOURCE_TYPE_SURFACE_YUV,
    profile: {
      audioBitrate: Constants.AUDIO_BITRATE_SAMPLE_RATE,
      audioChannels: Constants.AUDIO_CHANNELS,
      audioCodec: media.CodecMimeType.AUDIO_AAC,
      audioSampleRate: Constants.AUDIO_BITRATE_SAMPLE_RATE,
      fileFormat: media.ContainerFormatType.CFT_MPEG_4,
      videoBitrate: Constants.VIDEO_BITRATE,
      videoCodec: media.CodecMimeType.VIDEO_AVC,
      videoFrameWidth: Constants.VIDEO_FRAME_WIDTH,
      videoFrameHeight: Constants.VIDEO_FRAME_HEIGHT,
      videoFrameRate: Constants.VIDEO_FRAME_RATE
    },
    url: '',
    metadata: {
      videoOrientation: ''
    }
  };
  private photoRotationMap: PhotoRotationMap = {
    rotation0: 0,
    rotation90: 90,
    rotation180: 180,
    rotation270: 270,
  };
  private settingDataObj: SettingDataObj = {
    mirrorBol: false,
    videoStabilizationMode: 0,
    exposureMode: 1,
    focusMode: 2,
    photoQuality: 1,
    locationBol: false,
    photoFormat: 1,
    photoOrientation: 0,
    photoResolution: 0,
    videoResolution: 0,
    videoFrame: 0,
    referenceLineBol: false
  };

  // After pausing, click 'stop' to reset the pause to default.
  onChangeIsModeBol() {
  }

  // Countdown capture and video.
  countTakeVideoFn() {
    if (this.countdownNum) {
      // Clear Countdown.
      if (this.countTimerOut) {
        clearTimeout(this.countTimerOut);
      }
      if (this.countTimerInt) {
        clearInterval(this.countTimerInt);
      }
      // Turn on timer.
      this.countTimerOut = setTimeout(() => {
        // Determine whether it is in video or photo mode.
        this.isVideoPhotoFn();
      }, this.countdownNum * 1000);
      // Turn on timer.
      this.countTimerInt = setInterval(() => {
        this.countdownNum--;
        if (this.countdownNum === 0) {
          clearInterval(this.countTimerInt);
        }
      }, 1000);
    } else {
      this.isVideoPhotoFn();
    }
  }

  async getVideoSurfaceID() {
    Logger.info(this.tag, `getVideoSurfaceID`);
    this.videoRecorder = await media.createAVRecorder();
    Logger.info(this.tag, `getVideoSurfaceID videoRecorder: ${this.videoRecorder}`);
    this.photoAsset = await this.mediaUtil.createAndGetUri(photoAccessHelper.PhotoType.VIDEO);
    Logger.info(this.tag, `getVideoSurfaceID photoAsset: ${this.photoAsset}`);
    this.fd = await this.mediaUtil.getFdPath(this.photoAsset);
    Logger.info(this.tag, `getVideoSurfaceID fd: ${this.fd}`);
    this.videoConfig.url = `fd://${this.fd}`;
    Logger.info(this.tag, `getVideoSurfaceID videoConfig.url : ${this.videoConfig.url}`);
    if (deviceInfo.deviceType === Constants.DEFAULT) {
      Logger.info(this.tag, `deviceType = default`);
      this.videoConfig.videoSourceType = media.VideoSourceType.VIDEO_SOURCE_TYPE_SURFACE_ES;
    }
    if (deviceInfo.deviceType === Constants.PHONE) {
      Logger.info(this.tag, `deviceType = phone`);
      this.videoConfig.videoSourceType = media.VideoSourceType.VIDEO_SOURCE_TYPE_SURFACE_YUV;
      this.videoConfig.profile.videoCodec = media.CodecMimeType.VIDEO_AVC;
      if (this.cameraDeviceIndex === 1) {
        this.videoConfig.metadata = { videoOrientation: '270' };
      } else {
        this.videoConfig.metadata = { videoOrientation: '90' };
      }
    }
    if (deviceInfo.deviceType === 'tablet') {
      Logger.info(this.tag, `deviceType = tablet`);
      this.videoConfig.videoSourceType = media.VideoSourceType.VIDEO_SOURCE_TYPE_SURFACE_YUV;
    }
    this.videoConfig.profile.videoFrameWidth = cameraDemo.getVideoFrameWidth();
    this.videoConfig.profile.videoFrameHeight = cameraDemo.getVideoFrameHeight();
    this.videoConfig.profile.videoFrameRate = cameraDemo.getVideoFrameRate();
    await this.videoRecorder.prepare(this.videoConfig);
    this.videoId = await this.videoRecorder.getInputSurface();
    Logger.info(this.tag, `getVideoSurfaceID videoId: ${this.videoId}`);
  }

  createImageReceiver() {
    try {
      this.mReceiver = image.createImageReceiver(this.cameraSize, 2000, 8);
      Logger.info(this.tag, `createImageReceiver value: ${this.mReceiver} `);
      this.mReceiver.on('imageArrival', () => {
        Logger.info(this.tag, 'imageArrival start');
        if (this.mReceiver) {
          this.mReceiver.readNextImage((err, image) => {
            Logger.info(this.tag, 'readNextImage start');
            if (err || image === undefined) {
              Logger.error(this.tag, 'readNextImage failed ');
              return;
            }
            image.getComponent(4, (errMsg, img) => {
              Logger.info(this.tag, 'getComponent start');
              if (errMsg || img === undefined) {
                Logger.info(this.tag, 'getComponent failed ');
                return;
              }
              let buffer = new ArrayBuffer(2048);
              if (img.byteBuffer) {
                buffer = img.byteBuffer;
              } else {
                Logger.error(this.tag, 'img.byteBuffer is undefined');
              }
              this.savePicture(buffer, image);
            });
          });
        }
      });
    } catch {
      Logger.info(this.tag, 'savePicture err');
    }
  }

  // Read Image.
  async savePicture(buffer: ArrayBuffer, img: image.Image) {
    try {
      Logger.info(this.tag, 'savePicture start');
      let photoAssetUri: string = await this.mediaUtil.createAndGetUri(photoAccessHelper.PhotoType.IMAGE);
      let imgPhotoUri: string = photoAssetUri;
      Logger.info(this.tag, `photoUri = ${imgPhotoUri}`);
      let imgFd = await this.mediaUtil.getFdPath(imgPhotoUri);
      Logger.info(this.tag, `fd = ${imgFd}`);
      fileIo.writeSync(imgFd, buffer);
      fileIo.closeSync(imgFd);
      await img.release();
      Logger.info(this.tag, 'save image End');
      setTimeout(() => {
        if (this.handleTakePicture) {
          this.handleTakePicture(imgPhotoUri);
        }
      }, 10);
    } catch (err) {
      Logger.info(this.tag, 'savePicture err' + JSON.stringify(err.message));
    }
  }

  async getPhotoSurfaceID() {
    if (this.mReceiver) {
      Logger.info(this.tag, 'imageReceiver has been created');
    } else {
      this.createImageReceiver();
    }
    if (this.mReceiver) {
      this.mSurfaceId = await this.mReceiver.getReceivingSurfaceId();
    }
    if (this.mSurfaceId) {
      Logger.info(this.tag, `createImageReceiver mSurfaceId: ${this.mSurfaceId} `);
    } else {
      Logger.info(this.tag, `Get mSurfaceId failed `);
    }
  }

  // Determine the video or photo mode.
  async isVideoPhotoFn() {
    await this.getPhotoSurfaceID();
    if (this.modelBagCol === Constants.PHOTO) {
      cameraDemo.startPhotoOrVideo(this.modelBagCol, this.videoId, this.mSurfaceId);
    } else if (this.modelBagCol === Constants.VIDEO) {
      this.isModeBol = false;
      if (this.timer) {
        clearInterval(this.timer);
      }
      // Start record.
      await this.getVideoSurfaceID();
      cameraDemo.startPhotoOrVideo(this.modelBagCol, this.videoId, this.mSurfaceId);
      cameraDemo.videoOutputStart();
      if (this.videoRecorder) {
        this.videoRecorder.start();
      }
    }
  }

  async handleTakePicture(thumbnail: string) {
    this.thumbnail = thumbnail;
    Logger.info(this.tag, `takePicture end , thumbnail: ${this.thumbnail}`);
  }

  aboutToDisappear() {
    if (this.mReceiver) {
      this.mReceiver.release().then(() => {
        Logger.info(this.tag, 'release succeeded.');
      }).catch((error: BusinessError) => {
        Logger.error(this.tag, `release failed, error: ${error}`);
      });
    }
  }

  build() {
    if (this.isModeBol) {
      Column() {
        Text($r('app.string.photo'))
          .size({ width: $r('app.float.model_size_width'), height: $r('app.float.model_size_height') })
          .borderRadius($r('app.float.border_radius'))
          .fontSize($r('app.float.photo_video_font_size'))
          .fontColor(Color.White)
          .onClick(() => {
            cameraDemo.releaseSession();
            cameraDemo.initCamera(this.surfaceId, this.settingDataObj.focusMode, this.cameraDeviceIndex);
            this.modelBagCol = Constants.PHOTO;
          })
      }.position({ x: Constants.PHOTO_X_POSITION, y: Constants.Y_POSITION })

      Column() {
        Text($r('app.string.video'))
          .size({ width: $r('app.float.model_size_width'), height: $r('app.float.model_size_height') })
          .borderRadius($r('app.float.border_radius'))
          .fontSize($r('app.float.photo_video_font_size'))
          .fontColor(Color.White)
          .onClick(() => {
            cameraDemo.releaseSession();
            cameraDemo.initCamera(this.surfaceId, this.settingDataObj.focusMode, this.cameraDeviceIndex);
            this.modelBagCol = Constants.VIDEO;
          })
      }.position({ x: Constants.VIDEO_X_POSITION, y: Constants.Y_POSITION })

      // Album.
      Column() {
        Row() {
          if (this.modelBagCol === Constants.PHOTO) {
            Image(this.thumbnail || $r('app.media.camera_thumbnail_4x'))
              .borderRadius(px2vp(Constants.ICON_SIZE / 2))
              .syncLoad(true)
              .objectFit(ImageFit.Fill)
              .width(px2vp(Constants.ICON_SIZE))
              .height(px2vp(Constants.ICON_SIZE))
          } else {
            Image(this.thumbnail || $r('app.media.camera_thumbnail_4x'))
              .borderRadius(px2vp(Constants.ICON_SIZE / 2))
              .objectFit(ImageFit.Fill)
              .width(px2vp(Constants.ICON_SIZE))
              .height(px2vp(Constants.ICON_SIZE))
          }
        }
        .onClick(() => {
          if (deviceInfo.deviceType === Constants.DEFAULT) {
            context.startAbility({
              bundleName: 'com.ohos.photos',
              abilityName: 'com.ohos.photos.MainAbility'
            });
          } else if (deviceInfo.deviceType === Constants.PHONE) {
            context.startAbility({
              bundleName: 'com.huawei.hmos.photos',
              abilityName: 'com.huawei.hmos.photos.MainAbility'
            });
          }
        })
      }
      .position({ x: Constants.ALBUM_X_POSITION, y: Constants.ICON_Y_POSITION })
      .id('Thumbnail')

      // Capture video icon.
      Column() {
        Row() {
          if (this.modelBagCol === Constants.PHOTO) {
            Image($r('app.media.camera_take_photo_4x'))
              .width(px2vp(Constants.ICON_SIZE))
              .height(px2vp(Constants.ICON_SIZE))
              .onClick(() => {
                // Countdown camera recording - default camera recording.
                this.countTakeVideoFn();
              })
          } else {
            Image($r('app.media.camera_take_video_4x'))
              .width(px2vp(Constants.ICON_SIZE))
              .height(px2vp(Constants.ICON_SIZE))
              .onClick(() => {
                // Countdown camera recording - default camera recording.
                this.countTakeVideoFn();
              })
          }
        }
      }.position({ x: Constants.CAPTURE_X_POSITION, y: Constants.ICON_Y_POSITION })
      .id('CaptureOrVideoButton')

      // Front and rear camera switching.
      Column() {
        Row() {
          Image($r('app.media.camera_switch_4x'))
            .width(px2vp(Constants.ICON_SIZE))
            .height(px2vp(Constants.ICON_SIZE))
            .onClick(async () => {
              // Switching cameras.
              this.cameraDeviceIndex ? this.cameraDeviceIndex = 0 : this.cameraDeviceIndex = 1;
              // Clear configuration.
              cameraDemo.releaseSession();
              // Start preview.
              cameraDemo.initCamera(this.surfaceId, this.settingDataObj.focusMode, this.cameraDeviceIndex);
            })
        }
      }.position({ x: Constants.SWITCH_X_POSITION, y: Constants.ICON_Y_POSITION })
      .id('SwitchCameraButton')
    } else {
      Column() {
        // Video capture button.
        Image($r('app.media.camera_take_photo_4x'))
          .width(px2vp(Constants.ICON_SIZE))
          .height(px2vp(Constants.ICON_SIZE))
          .onClick(() => {
            cameraDemo.takePictureWithSettings(this.photoSettings);
          })
      }.position({ x: Constants.ALBUM_X_POSITION, y: Constants.ICON_Y_POSITION })
      .id('VideoCaptureButton')

      Column() {
        Row() {
          Column() {
            // Video stop button.
            Image($r('app.media.camera_pause_video_4x'))
              .size({ width: $r('app.float.video_stop_size'), height: $r('app.float.video_stop_size') })
              .width(px2vp(Constants.ICON_SIZE))
              .height(px2vp(Constants.ICON_SIZE))
              .id('StopVideo')
              .onClick(() => {
                if (this.timer) {
                  clearInterval(this.timer);
                }
                // Stop video.
                this.stopVideo().then(() => {
                  this.videoRecodeTime = 0;
                  this.isModeBol = true;
                });
              })
          }
        }
        .width(px2vp(Constants.ICON_SIZE))
        .height(px2vp(Constants.ICON_SIZE))
      }.position({ x: Constants.CAPTURE_X_POSITION, y: Constants.ICON_Y_POSITION })
    }
  }

  async stopVideo() {
    try {
      if (this.videoRecorder) {
        await this.videoRecorder.stop();
        await this.videoRecorder.release();
      }
      cameraDemo.videoOutputStopAndRelease();
      let result: photoAccessHelper.PhotoAsset | undefined = undefined;
      if (this.photoAsset) {
        await fileIo.close(this.fd);
        setTimeout(async () => {
          let phAccessHelper = photoAccessHelper.getPhotoAccessHelper(context);
          let predicates: dataSharePredicates.DataSharePredicates = new dataSharePredicates.DataSharePredicates();
          let fetchOptions: photoAccessHelper.FetchOptions = {
            fetchColumns: [],
            predicates: predicates
          };
          let fetchResult: photoAccessHelper.FetchResult<photoAccessHelper.PhotoAsset> = await phAccessHelper.getAssets(fetchOptions);
          let photoAssetList: Array<photoAccessHelper.PhotoAsset> = await fetchResult.getAllObjects();
          photoAssetList.forEach((item: photoAccessHelper.PhotoAsset) => {
            if (item.uri === this.photoAsset) {
              Logger.info(this.tag, `item.uri = ${item.uri}`);
              result = item;
            }
          });
          try {
            // Get video thumbnail.
            this.thumbnail = await result?.getThumbnail();
            Logger.info(this.tag, 'videoThumbnail = ' + JSON.stringify(this.thumbnail));
          } catch (err) {
            Logger.error(this.tag, 'videoThumbnail err----------:' + JSON.stringify(err.message));
          }
        }, 1000);
      }
      Logger.info(this.tag, 'stopVideo end');
    } catch (err) {
      Logger.error(this.tag, 'stopVideo err: ' + JSON.stringify(err));
    }
    return;
  }
}
```
[main.cpp]

```cpp
/*
 * Copyright (c) 2024 Huawei Device Co., Ltd.
 * Licensed under the Apache License, Version 2.0 (the 'License');
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an 'AS IS' BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
#include <hilog/log.h>
#include <js_native_api.h>
#include <node_api.h>
#include "camera_manager.h"

using namespace OHOS_CAMERA_SAMPLE;

static NDKCamera *ndkCamera_ = nullptr;
const int32_t ARGS_TWO = 2;

struct Capture_Setting {
    int32_t quality;
    int32_t rotation;
    int32_t location;
    bool mirror;
    int32_t latitude;
    int32_t longitude;
    int32_t altitude;
};

static napi_value SetZoomRatio(napi_env env, napi_callback_info info)
{
    size_t argc = 2;
    napi_value args[2] = {nullptr};
    napi_value result;
    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);
    napi_valuetype valuetype0;
    napi_typeof(env, args[0], &valuetype0);
    int32_t zoomRatio;
    napi_get_value_int32(env, args[0], &zoomRatio);
    OH_LOG_INFO(LOG_APP, "SetZoomRatio : %{public}d", zoomRatio);
    ndkCamera_->setZoomRatioFn(zoomRatio);
    napi_create_int32(env, argc, &result);
    return result;
}

static napi_value HasFlash(napi_env env, napi_callback_info info)
{
    OH_LOG_INFO(LOG_APP, "HasFlash");
    size_t argc = 2;
    napi_value args[2] = {nullptr};
    napi_value result;
    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);
    napi_valuetype valuetype0;
    napi_typeof(env, args[0], &valuetype0);
    int32_t flashMode;
    napi_get_value_int32(env, args[0], &flashMode);
    OH_LOG_INFO(LOG_APP, "HasFlash flashMode : %{public}d", flashMode);
    ndkCamera_->HasFlashFn(flashMode);
    napi_create_int32(env, argc, &result);
    return result;
}

static napi_value IsVideoStabilizationModeSupported(napi_env env, napi_callback_info info)
{
    OH_LOG_INFO(LOG_APP, "IsVideoStabilizationModeSupportedFn");
    size_t argc = 2;
    napi_value args[2] = {nullptr};
    napi_value result;
    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);
    napi_valuetype valuetype0;
    napi_typeof(env, args[0], &valuetype0);
    int32_t videoMode;
    napi_get_value_int32(env, args[0], &videoMode);
    OH_LOG_INFO(LOG_APP, "IsVideoStabilizationModeSupportedFn videoMode : %{public}d", videoMode);
    ndkCamera_->IsVideoStabilizationModeSupportedFn(videoMode);
    napi_create_int32(env, argc, &result);
    return result;
}

static napi_value InitCamera(napi_env env, napi_callback_info info)
{
    OH_LOG_INFO(LOG_APP, "InitCamera Start");
    size_t argc = 3;
    napi_value args[3] = {nullptr};
    napi_value result;
    size_t typeLen = 0;
    char *surfaceId = nullptr;
    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);
    napi_get_value_string_utf8(env, args[0], nullptr, 0, &typeLen);
    surfaceId = new char[typeLen + 1];
    napi_get_value_string_utf8(env, args[0], surfaceId, typeLen + 1, &typeLen);
    napi_valuetype valuetype1;
    napi_typeof(env, args[1], &valuetype1);
    int32_t focusMode;
    napi_get_value_int32(env, args[1], &focusMode);
    uint32_t cameraDeviceIndex;
    napi_get_value_uint32(env, args[ARGS_TWO], &cameraDeviceIndex);
    OH_LOG_INFO(LOG_APP, "InitCamera focusMode : %{public}d", focusMode);
    OH_LOG_INFO(LOG_APP, "InitCamera surfaceId : %{public}s", surfaceId);
    OH_LOG_INFO(LOG_APP, "InitCamera cameraDeviceIndex : %{public}d", cameraDeviceIndex);
    if (ndkCamera_) {
        OH_LOG_INFO(LOG_APP, "ndkCamera_ is not null");
        delete ndkCamera_;
        ndkCamera_ = nullptr;
    }
    ndkCamera_ = new NDKCamera(surfaceId, focusMode, cameraDeviceIndex);
    OH_LOG_INFO(LOG_APP, "InitCamera End");
    napi_create_int32(env, argc, &result);
    return result;
}

static napi_value ReleaseCamera(napi_env env, napi_callback_info info)
{
    OH_LOG_INFO(LOG_APP, "ReleaseCamera Start");
    size_t argc = 2;
    napi_value args[2] = {nullptr};
    napi_value result;
    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);
    ndkCamera_->ReleaseCamera();
    if (ndkCamera_) {
        OH_LOG_INFO(LOG_APP, "ndkCamera_ is not null");
        delete ndkCamera_;
        ndkCamera_ = nullptr;
    }
    OH_LOG_INFO(LOG_APP, "ReleaseCamera End");
    napi_create_int32(env, argc, &result);
    return result;
}

static napi_value ReleaseSession(napi_env env, napi_callback_info info)
{
    OH_LOG_INFO(LOG_APP, "ReleaseCamera Start");
    size_t argc = 2;
    napi_value args[2] = {nullptr};
    napi_value result;
    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);
    ndkCamera_->ReleaseSession();
    OH_LOG_INFO(LOG_APP, "ReleaseCamera End");
    napi_create_int32(env, argc, &result);
    return result;
}

static napi_value StartPhotoOrVideo(napi_env env, napi_callback_info info)
{
    OH_LOG_INFO(LOG_APP, "StartPhotoOrVideo Start");
    Camera_ErrorCode ret = CAMERA_OK;
    size_t argc = 3;
    napi_value args[3] = {nullptr};
    napi_value result;
    size_t typeLen = 0;
    size_t videoIdLen = 0;
    size_t photoIdLen = 0;
    char *modeFlag = nullptr;
    char *videoId = nullptr;
    char *photoId = nullptr;
    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);
    napi_get_value_string_utf8(env, args[0], nullptr, 0, &typeLen);
    modeFlag = new char[typeLen + 1];
    napi_get_value_string_utf8(env, args[0], modeFlag, typeLen + 1, &typeLen);
    napi_get_value_string_utf8(env, args[1], nullptr, 0, &videoIdLen);
    videoId = new char[videoIdLen + 1];
    napi_get_value_string_utf8(env, args[1], videoId, videoIdLen + 1, &videoIdLen);
    napi_get_value_string_utf8(env, args[ARGS_TWO], nullptr, 0, &photoIdLen);
    photoId = new char[photoIdLen + 1];
    napi_get_value_string_utf8(env, args[ARGS_TWO], photoId, photoIdLen + 1, &photoIdLen);
    if (!strcmp(modeFlag, "photo")) {
        OH_LOG_INFO(LOG_APP, "StartPhoto surfaceId %{public}s", photoId);
        ret = ndkCamera_->StartPhoto(photoId);
    } else if (!strcmp(modeFlag, "video")) {
        ret = ndkCamera_->StartVideo(videoId, photoId);
        OH_LOG_INFO(LOG_APP, "StartPhotoOrVideo %{public}s, %{public}s", videoId, photoId);
    }
    napi_create_int32(env, ret, &result);
    return result;
}

static napi_value VideoOutputStart(napi_env env, napi_callback_info info)
{
    if (info == nullptr) {
        OH_LOG_ERROR(LOG_APP, "Info is nullptr.");
    }
    OH_LOG_INFO(LOG_APP, "VideoOutputStart Start");
    napi_value result;
    Camera_ErrorCode ret = ndkCamera_->VideoOutputStart();
    napi_create_int32(env, ret, &result);
    return result;
}

static napi_value IsExposureModeSupported(napi_env env, napi_callback_info info)
{
    OH_LOG_INFO(LOG_APP, "IsExposureModeSupported exposureMode start.");
    size_t argc = 2;
    napi_value args[2] = {nullptr};
    napi_value result;
    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);
    napi_valuetype valuetype0;
    napi_typeof(env, args[0], &valuetype0);
    int32_t exposureMode;
    napi_get_value_int32(env, args[0], &exposureMode);
    OH_LOG_INFO(LOG_APP, "IsExposureModeSupported exposureMode : %{public}d", exposureMode);
    ndkCamera_->IsExposureModeSupportedFn(exposureMode);
    OH_LOG_INFO(LOG_APP, "IsExposureModeSupported exposureMode end.");
    napi_create_int32(env, argc, &result);
    return result;
}

static napi_value IsMeteringPoint(napi_env env, napi_callback_info info)
{
    size_t argc = 2;
    napi_value args[2] = {nullptr};
    napi_value result;
    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);
    napi_valuetype valuetype0;
    napi_typeof(env, args[0], &valuetype0);
    int x;
    napi_get_value_int32(env, args[0], &x);
    napi_typeof(env, args[0], &valuetype0);
    int y;
    napi_get_value_int32(env, args[1], &y);
    ndkCamera_->IsMeteringPoint(x, y);
    napi_create_int32(env, argc, &result);
    return result;
}

static napi_value IsExposureBiasRange(napi_env env, napi_callback_info info)
{
    OH_LOG_INFO(LOG_APP, "IsExposureBiasRange start.");
    size_t argc = 2;
    napi_value args[2] = {nullptr};
    napi_value result;
    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);
    napi_valuetype valuetype0;
    napi_typeof(env, args[0], &valuetype0);
    int exposureBiasValue;
    napi_get_value_int32(env, args[0], &exposureBiasValue);
    ndkCamera_->IsExposureBiasRange(exposureBiasValue);
    OH_LOG_INFO(LOG_APP, "IsExposureBiasRange end.");
    napi_create_int32(env, argc, &result);
    return result;
}

static napi_value IsFocusModeSupported(napi_env env, napi_callback_info info)
{
    OH_LOG_INFO(LOG_APP, "IsFocusModeSupported start.");
    size_t argc = 2;
    napi_value args[2] = {nullptr};
    napi_value result;
    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);
    napi_valuetype valuetype0;
    napi_typeof(env, args[0], &valuetype0);
    int32_t focusMode;
    napi_get_value_int32(env, args[0], &focusMode);
    OH_LOG_INFO(LOG_APP, "IsFocusModeSupportedFn videoMode : %{public}d", focusMode);
    ndkCamera_->IsFocusModeSupported(focusMode);
    OH_LOG_INFO(LOG_APP, "IsFocusModeSupported end.");
    napi_create_int32(env, argc, &result);
    return result;
}

static napi_value IsFocusPoint(napi_env env, napi_callback_info info)
{
    size_t argc = 2;
    napi_value args[2] = {nullptr};
    napi_value result;
    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);
    napi_valuetype valuetype0;
    napi_typeof(env, args[0], &valuetype0);
    double x;
    napi_get_value_double(env, args[0], &x);
    napi_valuetype valuetype1;
    napi_typeof(env, args[1], &valuetype1);
    double y;
    napi_get_value_double(env, args[1], &y);
    float focusPointX = static_cast<float>(x);
    float focusPointY = static_cast<float>(y);
    ndkCamera_->IsFocusPoint(focusPointX, focusPointY);
    napi_create_int32(env, argc, &result);
    return result;
}

static napi_value GetVideoFrameWidth(napi_env env, napi_callback_info info)
{
    OH_LOG_INFO(LOG_APP, "GetVideoFrameWidth Start");
    size_t argc = 1;
    napi_value args[1] = {nullptr};
    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);
    napi_value result = nullptr;
    napi_create_int32(env, ndkCamera_->GetVideoFrameWidth(), &result);
    OH_LOG_INFO(LOG_APP, "GetVideoFrameWidth End");
    return result;
}

static napi_value GetVideoFrameHeight(napi_env env, napi_callback_info info)
{
    OH_LOG_INFO(LOG_APP, "GetVideoFrameHeight Start");
    size_t argc = 1;
    napi_value args[1] = {nullptr};
    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);
    napi_value result = nullptr;
    napi_create_int32(env, ndkCamera_->GetVideoFrameHeight(), &result);
    OH_LOG_INFO(LOG_APP, "GetVideoFrameHeight End");
    return result;
}

static napi_value GetVideoFrameRate(napi_env env, napi_callback_info info)
{
    OH_LOG_INFO(LOG_APP, "GetVideoFrameRate Start");
    size_t argc = 1;
    napi_value args[1] = {nullptr};
    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);
    napi_value result = nullptr;
    napi_create_int32(env, ndkCamera_->GetVideoFrameRate(), &result);
    OH_LOG_INFO(LOG_APP, "GetVideoFrameRate End");
    return result;
}

static napi_value VideoOutputStopAndRelease(napi_env env, napi_callback_info info)
{
    OH_LOG_INFO(LOG_APP, "VideoOutputStopAndRelease Start");
    size_t argc = 1;
    napi_value args[1] = {nullptr};
    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);
    napi_value result = nullptr;
    ndkCamera_->VideoOutputStop();
    ndkCamera_->VideoOutputRelease();
    OH_LOG_INFO(LOG_APP, "VideoOutputStopAndRelease End");
    napi_create_int32(env, argc, &result);
    return result;
}

static napi_value TakePicture(napi_env env, napi_callback_info info)
{
    if (info == nullptr) {
        OH_LOG_ERROR(LOG_APP, "Info is nullptr.");
    }
    OH_LOG_INFO(LOG_APP, "TakePicture Start");
    napi_value result;
    Camera_ErrorCode ret = ndkCamera_->TakePicture();
    OH_LOG_INFO(LOG_APP, "TakePicture result is %{public}d", ret);
    napi_create_int32(env, ret, &result);
    return result;
}

static napi_value GetCaptureParam(napi_env env, napi_value captureConfigValue, Capture_Setting *config)
{
    napi_value value = nullptr;
    napi_get_named_property(env, captureConfigValue, "quality", &value);
    napi_get_value_int32(env, value, &config->quality);
    napi_get_named_property(env, captureConfigValue, "rotation", &value);
    napi_get_value_int32(env, value, &config->rotation);
    napi_get_named_property(env, captureConfigValue, "mirror", &value);
    napi_get_value_bool(env, value, &config->mirror);
    napi_get_named_property(env, captureConfigValue, "latitude", &value);
    napi_get_value_int32(env, value,
```
&config->latitude); napi_get_named_property(env, captureConfigValue, "longitude", &value); napi_get_value_int32(env, value, &config->longitude); napi_get_named_property(env, captureConfigValue, "altitude", &value); napi_get_value_int32(env, value, &config->altitude); OH_LOG_INFO(LOG_APP, "get quality %{public}d, rotation %{public}d, mirror %{public}d, latitude " "%{public}d, longitude %{public}d, altitude %{public}d", config->quality, config->rotation, config->mirror, config->latitude, config->longitude, config->altitude); return 0; } static void SetConfig(Capture_Setting settings, Camera_PhotoCaptureSetting *photoSetting, Camera_Location *location) { if (photoSetting == nullptr || location == nullptr) { OH_LOG_INFO(LOG_APP, "photoSetting is null"); } photoSetting->quality = static_cast<Camera_QualityLevel>(settings.quality); photoSetting->rotation = static_cast<Camera_ImageRotation>(settings.rotation); photoSetting->mirror = settings.mirror; location->altitude = settings.altitude; location->latitude = settings.latitude; location->longitude = settings.longitude; photoSetting->location = location; } static napi_value TakePictureWithSettings(napi_env env, napi_callback_info info) { OH_LOG_INFO(LOG_APP, "TakePictureWithSettings Start"); size_t argc = 1; napi_value args[1] = {nullptr}; Camera_PhotoCaptureSetting photoSetting; Capture_Setting setting_inner; Camera_Location *location = new Camera_Location; napi_get_cb_info(env, info, &argc, args, nullptr, nullptr); GetCaptureParam(env, args[0], &setting_inner); SetConfig(setting_inner, &photoSetting, location); napi_value result; Camera_ErrorCode ret = ndkCamera_->TakePictureWithPhotoSettings(photoSetting); OH_LOG_INFO(LOG_APP, "TakePictureWithSettings result is %{public}d", ret); napi_create_int32(env, ret, &result); return result; } EXTERN_C_START static napi_value Init(napi_env env, napi_value exports) { napi_property_descriptor desc[] = { {"initCamera", nullptr, InitCamera, nullptr, nullptr, nullptr, napi_default, 
nullptr}, {"startPhotoOrVideo", nullptr, StartPhotoOrVideo, nullptr, nullptr, nullptr, napi_default, nullptr}, {"videoOutputStart", nullptr, VideoOutputStart, nullptr, nullptr, nullptr, napi_default, nullptr}, {"setZoomRatio", nullptr, SetZoomRatio, nullptr, nullptr, nullptr, napi_default, nullptr}, {"hasFlash", nullptr, HasFlash, nullptr, nullptr, nullptr, napi_default, nullptr}, {"isVideoStabilizationModeSupported", nullptr, IsVideoStabilizationModeSupported, nullptr, nullptr, nullptr, napi_default, nullptr}, {"isExposureModeSupported", nullptr, IsExposureModeSupported, nullptr, nullptr, nullptr, napi_default, nullptr}, {"isMeteringPoint", nullptr, IsMeteringPoint, nullptr, nullptr, nullptr, napi_default, nullptr}, {"isExposureBiasRange", nullptr, IsExposureBiasRange, nullptr, nullptr, nullptr, napi_default, nullptr}, {"IsFocusModeSupported", nullptr, IsFocusModeSupported, nullptr, nullptr, nullptr, napi_default, nullptr}, {"isFocusPoint", nullptr, IsFocusPoint, nullptr, nullptr, nullptr, napi_default, nullptr}, {"getVideoFrameWidth", nullptr, GetVideoFrameWidth, nullptr, nullptr, nullptr, napi_default, nullptr}, {"getVideoFrameHeight", nullptr, GetVideoFrameHeight, nullptr, nullptr, nullptr, napi_default, nullptr}, {"getVideoFrameRate", nullptr, GetVideoFrameRate, nullptr, nullptr, nullptr, napi_default, nullptr}, {"videoOutputStopAndRelease", nullptr, VideoOutputStopAndRelease, nullptr, nullptr, nullptr, napi_default, nullptr}, {"takePicture", nullptr, TakePicture, nullptr, nullptr, nullptr, napi_default, nullptr}, {"takePictureWithSettings", nullptr, TakePictureWithSettings, nullptr, nullptr, nullptr, napi_default, nullptr}, {"releaseSession", nullptr, ReleaseSession, nullptr, nullptr, nullptr, napi_default, nullptr}, {"releaseCamera", nullptr, ReleaseCamera, nullptr, nullptr, nullptr, napi_default, nullptr}}; napi_define_properties(env, exports, sizeof(desc) / sizeof(desc[0]), desc); return exports; } EXTERN_C_END static napi_module demoModule = { .nm_version 
= 1, .nm_flags = 0, .nm_filename = nullptr, .nm_register_func = Init, .nm_modname = "entry", .nm_priv = ((void *)0), .reserved = {0}, }; extern "C" __attribute__((constructor)) void RegisterEntryModule(void) { napi_module_register(&demoModule); }
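Seen from ArkTS, the property table registered above is the module's entire JS surface. The sketch below drives a record-then-stop cycle against that surface; the `EntryModule` interface is hand-written here for illustration (the real project declares its typings for `libentry.so` in its own `.d.ts` file), and a return code of 0 corresponds to `CAMERA_OK` on the C++ side.

```typescript
// Hypothetical typings for the exported functions used by the video flow; the
// real declarations come from the project's index.d.ts for 'libentry.so'.
interface EntryModule {
  startPhotoOrVideo(mode: string, videoId: string, photoId: string): number;
  videoOutputStart(): number;
  videoOutputStopAndRelease(): number;
}

// Drive one record-then-stop cycle and collect the native return codes.
function recordOnce(native: EntryModule, videoId: string, photoId: string): number[] {
  return [
    native.startPhotoOrVideo('video', videoId, photoId), // reconfigure the session for video
    native.videoOutputStart(),                           // begin recording
    native.videoOutputStopAndRelease(),                  // stop and release the video output
  ];
}
```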
Preview: the preview is started from the onPageShow callback in Index.ets, which calls cameraDemo.initCamera, passing down the preview surfaceId, the focus-mode value, and whether the front or rear camera is selected. That call lands in the InitCamera interface in main.cpp, which converts the arguments received from the JS side and forwards them to the constructor in CameraManager.cpp, which opens the camera, starts the preview, and sets the focus mode.
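The preview-initialization call described above can be sketched from the ArkTS side as follows. The `CameraNative` interface and the focus-mode constant are assumptions made for illustration; the sample's actual typings and enum values come from its `libentry.so` declarations.

```typescript
// Hypothetical typing for the native entry used by onPageShow; the real import
// is `import cameraDemo from 'libentry.so'`.
interface CameraNative {
  initCamera(surfaceId: string, focusMode: number, useFrontCamera: boolean): number;
}

// Open the camera and start preview on the given surface. Rear camera and a
// placeholder focus-mode value of 2 (assumed continuous auto-focus) are used.
function startPreview(native: CameraNative, surfaceId: string): number {
  const FOCUS_MODE_CONTINUOUS_AUTO = 2; // assumed enum value, for illustration only
  const useFrontCamera = false;         // rear camera by default
  return native.initCamera(surfaceId, FOCUS_MODE_CONTINUOUS_AUTO, useFrontCamera);
}
```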
The calling side of the camera focus and exposure features lives in FocusAreaPage.ets; for the source, see: [FocusAreaPage.ets]
```typescript
/*
 * Copyright (c) 2024 Huawei Device Co., Ltd.
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
import cameraDemo from 'libentry.so';
import Logger from '../common/utils/Logger';
import { Constants } from '../common/Constants'

const TAG: string = 'FocusAreaPage';

// Focus Area.
@Component
export struct FocusAreaPage {
  @Link focusPointBol: boolean;
  @Link focusPointVal: Array<number>;
  // Display where scale, focal length value, and focus box cannot coexist.
  @Link exposureBol: boolean;
  // Exposure value.
  @Link exposureNum: number;
  @Prop xComponentWidth: number;
  @Prop xComponentHeight: number;
  // Focusing area display box timer.
  private areaTimer: number = -1;
  // Sliding exposure up and down.
  private panOption: PanGestureOptions = new PanGestureOptions({
    direction: PanDirection.Up | PanDirection.Down,
    fingers: 1
  });

  build() {
    Row() {
    }
    .width(Constants.FULL_PERCENT)
    .height(Constants.SEVENTY_PERCENT)
    .margin({ bottom: Constants.FIFTEEN_PERCENT })
    .opacity(1)
    .id('FocusArea')
    .onTouch((e: TouchEvent) => {
      if (e.type === TouchType.Down) {
        this.focusPointBol = true;
        this.focusPointVal[0] = e.touches[0].windowX;
        this.focusPointVal[1] = e.touches[0].windowY;
        // Focus point.
        cameraDemo.isFocusPoint(
          e.touches[0].windowX / this.xComponentWidth,
          e.touches[0].windowY / this.xComponentHeight
        );
        cameraDemo.isMeteringPoint(
          e.touches[0].windowX / this.xComponentWidth,
          e.touches[0].windowY / this.xComponentHeight + 50
        );
      }
      if (e.type === TouchType.Up) {
        if (this.areaTimer) {
          clearTimeout(this.areaTimer);
        }
        this.areaTimer = setTimeout(() => {
          this.focusPointBol = false;
        }, 3500);
      }
    })
    // Trigger this gesture event by dragging vertically with one finger.
    .gesture(
      PanGesture(this.panOption)
        .onActionStart(() => {
          Logger.info(TAG, 'PanGesture onActionStart');
          this.exposureBol = false;
        })
        .onActionUpdate((event: GestureEvent) => {
          let offset = -event.offsetY;
          if (offset > Constants.EVENT_Y_OFFSET) {
            this.exposureNum = 4;
          }
          if (offset < Constants.EVENT_Y_OFFSET1) {
            this.exposureNum = -4;
          }
          if (offset > Constants.EVENT_Y_OFFSET1 && offset < Constants.EVENT_Y_OFFSET) {
            this.exposureNum = Number((offset / 50).toFixed(1));
          }
          // Exposure compensation ranges from -4 to +4.
          cameraDemo.isExposureBiasRange(this.exposureNum);
          Logger.info(TAG, `PanGesture onActionUpdate offset: ${offset}, exposureNum: ${this.exposureNum}`);
        })
        .onActionEnd(() => {
          this.exposureNum = 0;
          this.exposureBol = true;
          Logger.info(TAG, 'PanGesture onActionEnd end');
        })
    )
  }
}
```
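The pan-gesture handler above maps the vertical drag distance to an exposure-compensation value in [-4, +4]. Factored out as a pure function, the mapping looks like the sketch below. The thresholds EVENT_Y_OFFSET / EVENT_Y_OFFSET1 are assumed to be +200 / -200 here (the sample reads the real values from its Constants class), and exact boundary offsets are clamped rather than left unhandled as in the original strict comparisons.

```typescript
// Assumed thresholds; the sample keeps the real values in its Constants class.
const EVENT_Y_OFFSET = 200;   // drag distance (px) that pins exposure at +4
const EVENT_Y_OFFSET1 = -200; // drag distance (px) that pins exposure at -4

// Map a raw pan offsetY (down = positive) to an exposure bias in [-4, +4],
// rounded to one decimal; 50 px of drag corresponds to one exposure step.
function offsetToExposure(offsetY: number): number {
  const offset = -offsetY; // invert so that dragging up increases exposure
  if (offset >= EVENT_Y_OFFSET) {
    return 4;
  }
  if (offset <= EVENT_Y_OFFSET1) {
    return -4;
  }
  return Number((offset / 50).toFixed(1));
}
```

The resulting value is what the page passes to cameraDemo.isExposureBiasRange on every gesture update.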
The above covers this small slice of HarmonyOS development.