First, I should mention that I'm new to both iOS and Xamarin. I'm trying to watermark a video with an image and some text.
I ported most of the code from this answer: https://stackoverflow.com/a/22016800/5275669 .
The full code is below:
try {
    var videoAsset = AVUrlAsset.FromUrl (new NSUrl (filepath, false)) as AVUrlAsset;
    AVMutableComposition mixComposition = AVMutableComposition.Create ();
    var compositionVideoTracks = mixComposition.AddMutableTrack (AVMediaType.Video, 0);
    AVAssetTrack clipVideoTrack = videoAsset.TracksWithMediaType (AVMediaType.Video) [0];
    var compositionAudioTrack = mixComposition.AddMutableTrack (AVMediaType.Audio, 0);
    AVAssetTrack clipAudioTrack = videoAsset.TracksWithMediaType (AVMediaType.Audio) [0];

    // Copy the whole source clip into the composition.
    NSError error;
    CMTimeRange timeRangeInAsset = new CMTimeRange ();
    timeRangeInAsset.Start = CMTime.Zero;
    timeRangeInAsset.Duration = videoAsset.Duration;
    compositionVideoTracks.InsertTimeRange (timeRangeInAsset, clipVideoTrack, CMTime.Zero, out error);
    compositionVideoTracks.InsertTimeRange (timeRangeInAsset, clipAudioTrack, CMTime.Zero, out error);
    compositionVideoTracks.PreferredTransform = clipVideoTrack.PreferredTransform;

    // Text watermark layer.
    CGSize sizeOfVideo = videoAsset.NaturalSize;
    CATextLayer textOfvideo = (CATextLayer)CATextLayer.Create ();
    textOfvideo.String = String.Format ("{0} {1}", DateTime.Now.ToLongTimeString (), "Test app");
    textOfvideo.SetFont (CGFont.CreateWithFontName ("Helvetica"));
    textOfvideo.FontSize = 50;
    textOfvideo.AlignmentMode = CATextLayer.AlignmentCenter;
    textOfvideo.Frame = new CGRect (0, 0, sizeOfVideo.Width, sizeOfVideo.Height / 6);
    textOfvideo.ForegroundColor = new CGColor (255, 0, 0);

    // Image watermark layer.
    UIImage myImage = UIImage.FromFile ("Icon-Small.png");
    CALayer layerCa = CALayer.Create ();
    layerCa.Contents = myImage.CGImage;
    layerCa.Frame = new CGRect (0, 0, 100, 100);
    layerCa.Opacity = 0.65F;

    CALayer optionalLayer = CALayer.Create ();
    optionalLayer.AddSublayer (textOfvideo);
    optionalLayer.Frame = new CGRect (0, 0, sizeOfVideo.Width, sizeOfVideo.Height);
    optionalLayer.MasksToBounds = true;

    // Layer tree: video at the bottom, watermarks on top.
    CALayer parentLayer = CALayer.Create ();
    CALayer videoLayer = CALayer.Create ();
    parentLayer.Frame = new CGRect (0, 0, sizeOfVideo.Width, sizeOfVideo.Height);
    videoLayer.Frame = new CGRect (0, 0, sizeOfVideo.Width, sizeOfVideo.Height);
    parentLayer.AddSublayer (videoLayer);
    parentLayer.AddSublayer (layerCa);
    parentLayer.AddSublayer (textOfvideo);

    AVMutableVideoComposition videoComposition = AVMutableVideoComposition.Create ();
    videoComposition.RenderSize = sizeOfVideo;
    videoComposition.FrameDuration = new CMTime (1, 30);
    videoComposition.AnimationTool = AVVideoCompositionCoreAnimationTool.FromLayer (videoLayer, parentLayer);

    AVMutableVideoCompositionInstruction instruction = AVMutableVideoCompositionInstruction.Create () as AVMutableVideoCompositionInstruction;
    CMTimeRange timeRangeInstruction = new CMTimeRange ();
    timeRangeInstruction.Start = CMTime.Zero;
    timeRangeInstruction.Duration = mixComposition.Duration;
    instruction.TimeRange = timeRangeInstruction;
    AVAssetTrack videoTrack = mixComposition.TracksWithMediaType (AVMediaType.Video) [0];
    AVMutableVideoCompositionLayerInstruction layerInstruction = AVMutableVideoCompositionLayerInstruction.FromAssetTrack (videoTrack);
    instruction.LayerInstructions = new AVVideoCompositionLayerInstruction[] { layerInstruction };
    List<AVVideoCompositionInstruction> instructions = new List<AVVideoCompositionInstruction> ();
    instructions.Add (instruction);
    videoComposition.Instructions = instructions.ToArray ();

    // Export next to the source file, with "_wm" appended to the name.
    var exportSession = new AVAssetExportSession (mixComposition, AVAssetExportSession.PresetMediumQuality);
    exportSession.VideoComposition = videoComposition;
    Console.WriteLine ("Original path is {0}", filepath);
    string newFileName = Path.GetFileName (filepath);
    newFileName = newFileName.Replace (".mp4", "_wm.mp4");
    string directoryName = Path.GetDirectoryName (filepath);
    string videoOutFilePath = Path.Combine (directoryName, newFileName);
    Console.WriteLine ("New path is {0}", videoOutFilePath);
    exportSession.OutputFileType = AVFileType.Mpeg4;
    exportSession.OutputUrl = NSUrl.FromFilename (videoOutFilePath);
    exportSession.ShouldOptimizeForNetworkUse = true;
    exportSession.ExportAsynchronously (() => {
        AVAssetExportSessionStatus status = exportSession.Status;
        Console.WriteLine ("Done with handler. Status: " + status.ToString ());
        switch (status) {
        case AVAssetExportSessionStatus.Completed:
            Console.WriteLine ("Successfully completed");
            if (File.Exists (videoOutFilePath))
                Console.WriteLine ("Created!!");
            else
                Console.WriteLine ("Failed");
            break;
        case AVAssetExportSessionStatus.Cancelled:
            break;
        case AVAssetExportSessionStatus.Exporting:
            break;
        case AVAssetExportSessionStatus.Failed:
            Console.WriteLine ("Task failed => {0}", exportSession.Error);
            Console.WriteLine (exportSession.Error.Description);
            break;
        case AVAssetExportSessionStatus.Unknown:
            break;
        case AVAssetExportSessionStatus.Waiting:
            break;
        default:
            break;
        }
    });
    // Note: this check runs immediately, before the asynchronous export has finished.
    if (File.Exists (videoOutFilePath))
        return videoOutFilePath;
} catch (Exception ex) {
    Console.WriteLine ("Error occurred : {0}", ex.Message);
}
I keep getting the following error:
Task failed => Error Domain=AVFoundationErrorDomain Code=-11820 "Cannot Complete Export" UserInfo=0x18896070 {NSLocalizedRecoverySuggestion=Try exporting again., NSLocalizedDescription=Cannot Complete Export}
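When chasing an error like this, it can help to log the structured NSError fields rather than only its description, since the domain and code pin down the failure. A small sketch for the Failed case of the switch above (standard Xamarin.iOS NSError properties; the exact placement is illustrative):

```csharp
case AVAssetExportSessionStatus.Failed:
    // Domain and code identify the failure; -11820 is AVFoundation's
    // generic "Cannot Complete Export" (AVErrorExportFailed).
    Console.WriteLine ("Domain: {0}, Code: {1}", exportSession.Error.Domain, exportSession.Error.Code);
    if (exportSession.Error.UserInfo != null)
        Console.WriteLine ("UserInfo: {0}", exportSession.Error.UserInfo);
    break;
```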
If I replace this:
var exportSession = new AVAssetExportSession (mixComposition, AVAssetExportSession.PresetMediumQuality);
with this:
var exportSession = new AVAssetExportSession (videoAsset, AVAssetExportSession.PresetMediumQuality);
it exports fine, but without the watermark.
Can anyone help?
Best answer
I managed to solve this by changing these lines:
compositionVideoTracks.InsertTimeRange (timeRangeInAsset, clipVideoTrack, CMTime.Zero, out error);
compositionVideoTracks.InsertTimeRange (timeRangeInAsset, clipAudioTrack, CMTime.Zero, out error);
compositionVideoTracks.PreferredTransform = clipVideoTrack.PreferredTransform;
to this:
compositionVideoTracks.InsertTimeRange (timeRangeInAsset, clipVideoTrack, CMTime.Zero, out error);
compositionAudioTrack.InsertTimeRange (timeRangeInAsset, clipAudioTrack, CMTime.Zero, out error);
compositionVideoTracks.PreferredTransform = clipVideoTrack.PreferredTransform;
though I'm not sure why this works. Can anyone explain?
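One plausible explanation: AddMutableTrack (AVMediaType.Video, 0) creates a video-only composition track, and the original code inserted the audio clip's time range into that same track, leaving compositionAudioTrack empty and the video track holding mismatched media, which AVAssetExportSession apparently cannot export. The working pattern is one composition track per media type, sketched here reusing the question's variable names (untested assumption):

```csharp
// Each source track is inserted into a composition track of its own media type.
compositionVideoTracks.InsertTimeRange (timeRangeInAsset, clipVideoTrack, CMTime.Zero, out error);
compositionAudioTrack.InsertTimeRange (timeRangeInAsset, clipAudioTrack, CMTime.Zero, out error);
```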
Regarding ios - Xamarin.iOS - ExportAsynchronously fails, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/33929938/