Apache Phoenix (4.3.1 and 4.4.0-HBase-0.98) on Spark 1.3.1 ClassNotFoundException

Jeroen Vlek

I'm trying to connect to Phoenix via Spark, and I keep getting the following exception when opening a connection via the JDBC driver (cut for brevity; the full stacktrace is further down):

Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.ipc.controller.ClientRpcControllerFactory
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)

The class in question is provided by the jar named phoenix-core-4.3.1.jar (even though it lives in the HBase package namespace; I assume they need that to integrate with HBase).

Now, there are plenty of questions about ClassNotFoundExceptions on Spark, and I've tried the fat-jar approach (with both Maven's assembly and shade plugins; I've inspected the jars, and they do contain ClientRpcControllerFactory), and I've tried a lean jar while specifying the jars on the command line. For the latter, the command I used was as follows:

/opt/mapr/spark/spark-1.3.1/bin/spark-submit --jars lib/spark-streaming-kafka_2.10-1.3.1.jar,lib/kafka_2.10-0.8.1.1.jar,lib/zkclient-0.3.jar,lib/metrics-core-3.1.0.jar,lib/metrics-core-2.2.0.jar,lib/phoenix-core-4.3.1.jar --class nl.work.kafkastreamconsumer.phoenix.KafkaPhoenixConnector KafkaStreamConsumer.jar node1:5181 0 topic jdbc:phoenix:node1:5181 true

I also did a classpath dump from within the code, and the Phoenix jar was already known to the very first classloader in the hierarchy:

2015-06-04 10:52:34,323 [Executor task launch worker-1] INFO  nl.work.kafkastreamconsumer.phoenix.LinePersister - [file:/home/work/projects/customer/KafkaStreamConsumer.jar, file:/home/work/projects/customer/lib/spark-streaming-kafka_2.10-1.3.1.jar, file:/home/work/projects/customer/lib/kafka_2.10-0.8.1.1.jar, file:/home/work/projects/customer/lib/zkclient-0.3.jar, file:/home/work/projects/customer/lib/metrics-core-3.1.0.jar, file:/home/work/projects/customer/lib/metrics-core-2.2.0.jar, file:/home/work/projects/customer/lib/phoenix-core-4.3.1.jar]

So the question is: what am I missing here? Why can't Spark load the correct class? There should be only one version of the class around (namely the one from phoenix-core), so I doubt it's a version conflict.

[Executor task launch worker-3] ERROR nl.work.kafkastreamconsumer.phoenix.LinePersister - Error while processing line
java.lang.RuntimeException: java.sql.SQLException: ERROR 103 (08004): Unable to establish connection.
        at nl.work.kafkastreamconsumer.phoenix.PhoenixConnection.<init>(PhoenixConnection.java:41)
        at nl.work.kafkastreamconsumer.phoenix.LinePersister$1.call(LinePersister.java:40)
        at nl.work.kafkastreamconsumer.phoenix.LinePersister$1.call(LinePersister.java:32)
        at org.apache.spark.api.java.JavaPairRDD$$anonfun$toScalaFunction$1.apply(JavaPairRDD.scala:999)
        at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
        at scala.collection.Iterator$class.foreach(Iterator.scala:727)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
        at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
        at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
        at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
        at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
        at scala.collection.AbstractIterator.to(Iterator.scala:1157)
        at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
        at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)
        at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
        at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)
        at org.apache.spark.rdd.RDD$$anonfun$17.apply(RDD.scala:813)
        at org.apache.spark.rdd.RDD$$anonfun$17.apply(RDD.scala:813)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1498)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1498)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
        at org.apache.spark.scheduler.Task.run(Task.scala:64)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.sql.SQLException: ERROR 103 (08004): Unable to establish connection.
        at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:362)
        at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:133)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:282)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl.access$300(ConnectionQueryServicesImpl.java:166)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl$11.call(ConnectionQueryServicesImpl.java:1831)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl$11.call(ConnectionQueryServicesImpl.java:1810)
        at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:77)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1810)
        at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:162)
        at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:126)
        at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:133)
        at java.sql.DriverManager.getConnection(DriverManager.java:571)
        at java.sql.DriverManager.getConnection(DriverManager.java:233)
        at nl.work.kafkastreamconsumer.phoenix.PhoenixConnection.<init>(PhoenixConnection.java:39)
        ... 25 more
Caused by: java.io.IOException: java.lang.reflect.InvocationTargetException
        at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:457)
        at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:350)
        at org.apache.phoenix.query.HConnectionFactory$HConnectionFactoryImpl.createConnection(HConnectionFactory.java:47)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:280)
        ... 36 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.GeneratedConstructorAccessor8.newInstance(Unknown Source)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:455)
        ... 39 more
Caused by: java.lang.UnsupportedOperationException: Unable to find org.apache.hadoop.hbase.ipc.controller.ClientRpcControllerFactory
        at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:36)
        at org.apache.hadoop.hbase.ipc.RpcControllerFactory.instantiate(RpcControllerFactory.java:56)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:769)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:689)
        ... 43 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.ipc.controller.ClientRpcControllerFactory
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:191)
        at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:32)
        ... 46 more

/edit

Unfortunately the problem still exists with 4.4.0-HBase-0.98. Below are the classes in question. Since the saveToPhoenix() method is not yet available for the Java API, and since this is just a POC, my idea was to simply use the JDBC driver for each mini-batch.

import java.io.Serializable;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class PhoenixConnection implements AutoCloseable, Serializable {
    private static final long serialVersionUID = -4491057264383873689L;
    private static final String PHOENIX_DRIVER = "org.apache.phoenix.jdbc.PhoenixDriver";

    static {
        try {
            Class.forName(PHOENIX_DRIVER);
        } catch (ClassNotFoundException e) {
            throw new RuntimeException(e);
        }
    }

    private Connection connection;

    public PhoenixConnection(final String jdbcUri) {

        try {
            connection = DriverManager.getConnection(jdbcUri);
        } catch (SQLException e) {
            throw new RuntimeException(e);
        }
    }

    public List<Map<String, Object>> executeQuery(final String sql) throws SQLException {

        ArrayList<Map<String, Object>> resultList = new ArrayList<>();
        try (PreparedStatement statement = connection.prepareStatement(sql); ResultSet resultSet = statement.executeQuery() ) {
            ResultSetMetaData metaData = resultSet.getMetaData();
            while (resultSet.next()) {
                Map<String, Object> row = new HashMap<>(metaData.getColumnCount());
                // JDBC column indices are 1-based, not 0-based
                for (int column = 1; column <= metaData.getColumnCount(); ++column) {
                    final String columnLabel = metaData.getColumnLabel(column);
                    row.put(columnLabel, resultSet.getObject(columnLabel));
                }
                // collect the row; without this the method always returned an empty list
                resultList.add(row);
            }
        }
        resultList.trimToSize();

        return resultList;
    }

    @Override
    public void close() {
        try {
            connection.close();
        } catch (SQLException e) {
            throw new RuntimeException(e);
        }
    }

}

import java.net.URLClassLoader;
import java.util.Arrays;
import java.util.List;

import org.apache.log4j.Logger;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.function.Function;

public class LinePersister implements Function<JavaRDD<String>, Void> {
    private static final long serialVersionUID = -2529724617108874989L;
    private static final Logger LOGGER = Logger.getLogger(LinePersister.class);
    private static final String TABLE_NAME = "mail_events";

    private final String jdbcUrl;

    public LinePersister(String jdbcUrl) {
        this.jdbcUrl = jdbcUrl;
    }



    @Override
    public Void call(JavaRDD<String> dataSet) throws Exception {
        LOGGER.info(String.format(
                "Starting conversion on rdd with %d elements", dataSet.count()));

        List<Void> collectResult = dataSet.map(new Function<String, Void>() {

            private static final long serialVersionUID = -6651313541439109868L;

            @Override
            public Void call(String line) throws Exception {
                LOGGER.info("Writing line " + line);
                Event event = EventParser.parseLine(line);
                try (PhoenixConnection connection = new PhoenixConnection(
                        jdbcUrl)) {
                    connection.executeQuery(event
                            .createUpsertStatement(TABLE_NAME));
                } catch (Exception e) {
                    LOGGER.error("Error while processing line", e);
                    dumpClasspath(this.getClass().getClassLoader());

                }
                return null;
            }
        }).collect();

        LOGGER.info(String.format("Got %d results: ", collectResult.size()));

        return null;
    }

    public static void dumpClasspath(ClassLoader loader)
    {
        LOGGER.info("Classloader " + loader + ":");

        if (loader instanceof URLClassLoader)
        {
            URLClassLoader ucl = (URLClassLoader)loader;
            LOGGER.info(Arrays.toString(ucl.getURLs()));
        }
        else
            LOGGER.error("cannot display components as not a URLClassLoader");

        if (loader.getParent() != null)
            dumpClasspath(loader.getParent());
    }
}
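For context, here is roughly how LinePersister would be hooked into the streaming job. The actual KafkaPhoenixConnector class isn't shown in this question, so this is only a reconstruction; the class name, batch interval, consumer group, and topic map below are invented to match the command-line arguments above:

import java.util.Collections;
import java.util.Map;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaPairReceiverInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kafka.KafkaUtils;

import scala.Tuple2;

// Sketch only: the real driver class is not part of the question.
public class KafkaPhoenixConnectorSketch {
    public static void main(String[] args) throws Exception {
        final String zkQuorum = "node1:5181";                    // from the spark-submit args
        final String jdbcUrl = "jdbc:phoenix:node1:5181";        // from the spark-submit args

        SparkConf conf = new SparkConf().setAppName("KafkaStreamConsumer");
        JavaStreamingContext ssc = new JavaStreamingContext(conf, Durations.seconds(10));

        // One receiver thread for the "topic" topic; the group id is made up.
        Map<String, Integer> topics = Collections.singletonMap("topic", 1);
        JavaPairReceiverInputDStream<String, String> messages =
                KafkaUtils.createStream(ssc, zkQuorum, "kafka-phoenix-poc", topics);

        // Drop the Kafka key and keep only the message payload.
        JavaDStream<String> lines = messages.map(
                new Function<Tuple2<String, String>, String>() {
                    private static final long serialVersionUID = 1L;

                    @Override
                    public String call(Tuple2<String, String> kv) {
                        return kv._2();
                    }
                });

        // LinePersister handles one micro-batch per call.
        lines.foreachRDD(new LinePersister(jdbcUrl));

        ssc.start();
        ssc.awaitTermination();
    }
}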

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>nl.work</groupId>
    <artifactId>KafkaStreamConsumer</artifactId>
    <version>1.0</version>
    <packaging>jar</packaging>
    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <maven.compiler.source>1.7</maven.compiler.source>
        <maven.compiler.target>1.7</maven.compiler.target>
        <spark.version>1.3.1</spark.version>
        <hibernate.version>4.3.10.Final</hibernate.version>
        <phoenix.version>4.4.0-HBase-0.98</phoenix.version>
        <hbase.version>0.98.9-hadoop2</hbase.version>
        <spark-hbase.version>0.0.2-clabs-spark-1.3.1</spark-hbase.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>${spark.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.10</artifactId>
            <version>${spark.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-kafka_2.10</artifactId>
            <version>${spark.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.phoenix</groupId>
            <artifactId>phoenix-core</artifactId>
            <version>${phoenix.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.phoenix</groupId>
            <artifactId>phoenix-spark</artifactId>
            <version>${phoenix.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-client</artifactId>
            <version>${hbase.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>com.cloudera</groupId>
            <artifactId>spark-hbase</artifactId>
            <version>${spark-hbase.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.10</version>
            <scope>test</scope>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.3</version>
                <configuration>
                    <source>${maven.compiler.source}</source>
                    <target>${maven.compiler.target}</target>
                </configuration>
            </plugin>
            <!-- <plugin> <groupId>org.apache.maven.plugins</groupId> <artifactId>maven-shade-plugin</artifactId> 
                <version>2.3</version> <executions> <execution> <phase>package</phase> <goals> 
                <goal>shade</goal> </goals> <configuration> <filters> <filter> <artifact>*:*</artifact> 
                <excludes> <exclude>META-INF/*.SF</exclude> <exclude>META-INF/*.DSA</exclude> 
                <exclude>META-INF/*.RSA</exclude> </excludes> </filter> </filters> </configuration> 
                </execution> </executions> </plugin> -->
        </plugins>
    </build>
    <repositories>
        <repository>
            <id>unknown-jars-temp-repo</id>
            <name>A temporary repository created by NetBeans for libraries and jars it could not identify. Please replace the dependencies in this repository with correct ones and delete this repository.</name>
            <url>file:${project.basedir}/lib</url>
        </repository>
    </repositories>
</project>

/edit2

I've also tried the saveAsNewAPIHadoopFile approach (https://gist.github.com/mravi/444afe7f49821819c987#file-phoenixsparkjob-java), but it yields the same error, just with a different stacktrace:

java.lang.RuntimeException: java.sql.SQLException: ERROR 103 (08004): Unable to establish connection.
        at org.apache.phoenix.mapreduce.PhoenixOutputFormat.getRecordWriter(PhoenixOutputFormat.java:58)
        at org.apache.spark.rdd.PairRDDFunctions$$anonfun$12.apply(PairRDDFunctions.scala:995)
        at org.apache.spark.rdd.PairRDDFunctions$$anonfun$12.apply(PairRDDFunctions.scala:979)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
        at org.apache.spark.scheduler.Task.run(Task.scala:64)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.sql.SQLException: ERROR 103 (08004): Unable to establish connection.
        at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:386)
        at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:145)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:288)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl.access$300(ConnectionQueryServicesImpl.java:171)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1881)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1860)
        at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:77)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1860)
        at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:162)
        at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:131)
        at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:133)
        at java.sql.DriverManager.getConnection(DriverManager.java:571)
        at java.sql.DriverManager.getConnection(DriverManager.java:187)
        at org.apache.phoenix.mapreduce.util.ConnectionUtil.getConnection(ConnectionUtil.java:92)
        at org.apache.phoenix.mapreduce.util.ConnectionUtil.getOutputConnection(ConnectionUtil.java:80)
        at org.apache.phoenix.mapreduce.util.ConnectionUtil.getOutputConnection(ConnectionUtil.java:68)
        at org.apache.phoenix.mapreduce.PhoenixRecordWriter.<init>(PhoenixRecordWriter.java:49)
        at org.apache.phoenix.mapreduce.PhoenixOutputFormat.getRecordWriter(PhoenixOutputFormat.java:55)
        ... 8 more
Caused by: java.io.IOException: java.lang.reflect.InvocationTargetException
        at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:457)
        at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:350)
        at org.apache.phoenix.query.HConnectionFactory$HConnectionFactoryImpl.createConnection(HConnectionFactory.java:47)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:286)
        ... 23 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:455)
        ... 26 more
Caused by: java.lang.UnsupportedOperationException: Unable to find org.apache.hadoop.hbase.ipc.controller.ClientRpcControllerFactory
        at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:36)
        at org.apache.hadoop.hbase.ipc.RpcControllerFactory.instantiate(RpcControllerFactory.java:56)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:769)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:689)
        ... 31 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.ipc.controller.ClientRpcControllerFactory
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:191)
        at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:32)
        ... 34 more
Jeroen Vlek

The good people on the Phoenix mailing list gave me the answer:

"Instead of bundling the Phoenix client JAR with your app, can you try including it in a static location either in the SPARK_CLASSPATH, or set the conf values below (I use SPARK_CLASSPATH personally, though it's deprecated): spark.driver.extraClassPath spark.executor.extraClassPath"

https://www.mail-archive.com/[email protected]/msg29978.html
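In practice that means placing the Phoenix client JAR at a fixed path on the driver and on every worker node, and pointing both settings at it. A sketch of the adjusted submit command, assuming the JAR lives at /opt/phoenix/phoenix-4.4.0-HBase-0.98-client.jar (a placeholder path; the actual location depends on your installation):

/opt/mapr/spark/spark-1.3.1/bin/spark-submit --conf "spark.driver.extraClassPath=/opt/phoenix/phoenix-4.4.0-HBase-0.98-client.jar" --conf "spark.executor.extraClassPath=/opt/phoenix/phoenix-4.4.0-HBase-0.98-client.jar" --class nl.work.kafkastreamconsumer.phoenix.KafkaPhoenixConnector KafkaStreamConsumer.jar node1:5181 0 topic jdbc:phoenix:node1:5181 true

With that in place, the Phoenix dependencies can stay at provided scope in the pom, and phoenix-core no longer needs to be passed via --jars or bundled into a fat jar.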
